What is the Frame Rate of the Human Eye?
- Published: Sep 30, 2024
- Eventually the frame rate discussion turns to what the human eye can see. And that conversation is full of myths and misunderstandings. For those daring few people who read the description blurb, here it is, the human eye HAS NO FRAME RATE!!! That's it, you can stop watching the video now.
HOWEVER...
There are some interesting biological phenomena that can be categorized as "frame rate of the human eye" if you're willing to abuse the terminology. And one of those numbers is 10 Frames Per Second!
Check out Wired's Video on Wagon Wheel Effect with Dr. Eagleman for a longer discussion:
• Why Your Brain Thinks ...
#FrameRate #Biology
In other words, we don't see reality. Our brain makes it all up!
not really
Close, but what we see is a recreation of reality by our brains.
I like and agree with your conclusion - The human eye HAS NO FRAME RATE.
I think debating how many frames the human eye can see is counterproductive because that's not how eyes work. Instead, a better question to ask is: what FPS is needed to completely fool our eyes into 'thinking' that they're looking at reality?
Now that's a completely different story.
Take the cursor on a screen example - at 60Hz we saw 2.5 discrete cursors, at 144Hz we saw 6. But if the cursor was a real object, neither our eye nor a camera would see multiple discrete cursors - instead we would see a continuous smear. To simulate that effect with a screen we would need a refresh rate so high that the moving cursor would still overlap a significant amount between refreshes. Again, we're looking for a way to fool the eyes into 'thinking' they're looking at a real moving object instead of apparent motion. Though the cursor is a very extreme example and would likely need at least a 1000Hz refresh rate to appear continuous (this strongly depends on the relative size of the cursor and its speed - the smaller and faster it is, the more Hz you need to make it appear continuous instead of broken into a 'phantom array').
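The phantom-array estimate above is a simple back-of-the-envelope calculation: divide the cursor's speed by how far it is allowed to travel between refreshes. A minimal sketch; the cursor size and speed below are illustrative numbers, not measurements:

```python
def min_refresh_hz(size_px: float, speed_px_s: float, overlap: float) -> float:
    """Refresh rate needed so consecutive images of a moving object
    overlap by at least `overlap` (a fraction 0..1) of its size."""
    max_step_px = size_px * (1.0 - overlap)  # allowed travel between refreshes
    return speed_px_s / max_step_px

# A 16 px cursor flicked across the screen at 4000 px/s, half overlap:
print(min_refresh_hz(16, 4000, 0.5))  # 500.0 Hz
# Smaller and faster means more Hz, as the comment says:
print(min_refresh_hz(8, 8000, 0.5))   # 2000.0 Hz
```

Halving the size and doubling the speed quadruples the required refresh rate, which is why a small, fast desktop cursor is such an extreme case.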
Now for a less extreme example - take a standard first person shooter - you have a much more natural motion of objects in a scene - moving people, you moving and turning yourself etc. Sadly it's an example you did not include in your analysis - when presented with the sentence "I can tell the difference between 60Hz display and 144Hz display" you immediately jumped into the cursor explanation, completely dismissing the fact that people CAN tell the difference between these refresh rates IN-GAME. And yes, I am one of those people.
Even when you're not looking for it, 144Hz is just so much more fluid and natural in-game than 60Hz.
It is also so much easier to track fast moving objects at higher refresh rates. For example in first person shooters you obviously have to shoot your opponents who are moving - and that depends on your ability to track the moving object and keep pointing your crosshair at it - at higher refresh rates this is much easier because all the motion is more fluid and you get more frequent updates which allow you to make faster and more precise adjustments in real time (watch the LinusTechTips "Does High FPS make you a better gamer? Ft. Shroud" for a very detailed analysis of this). And that wouldn't be possible if our eyes were running at a discrete framerate or if they weren't able to perceive more than say 60FPS as some people are claiming. If you had a camera at a fixed framerate, then increasing the refresh rate of the monitor that the camera was looking at wouldn't give the camera an advantage - the camera wouldn't get more frequent updates than its own internal framerate. Our eyes do get the advantage of 144Hz over 60.
Now the next step - 240Hz is smoother than 144Hz but the difference is not as staggering as between 144Hz and 60Hz. Still some people can tell the difference and 240Hz is still not enough to completely fool our eyes in every single case. For example you can still see the phantom array while moving the cursor on the desktop.
240Hz also wouldn't be enough to simulate the J. F. Schouten experiment on-screen and here's the math to prove it and calculate the minimum framerates to do so:
Let's take a disc divided into 15 equal sectors, each 24° wide.
Let's say that for the effect to occur we need the sectors to overlap by at least half of their width between frames, so 12°
Alpha occurs at 8-12 cycles (revolutions per second).
Beta: 30-35 cycles
Gamma: 40-100 cycles
To have the sectors overlap by half of the width of a sector we need there to be twice as many frames per revolution as there are sectors - 15 × 2 = 30
Alpha at 8 cycles * 30 frames per cycle = 240Hz, 12 cycles * 30 frames = 360Hz
Beta: 30 * 30 = 900Hz, 35 * 30 = 1050Hz
Gamma: 40 * 30 = 1200Hz, 100 * 30 = 3000Hz
Keep in mind that this is only to overlap the sectors on the fan by only half their width each frame. If we wanted them to overlap by 3/4 we'd need to multiply all the refresh rates by a factor of 2. I'd say that would probably be enough to fool our eyes into seeing the exact same effect on screen as we do in real life where the motion is completely continuous instead of broken into frames.
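The disc arithmetic above is mechanical enough to script. A small sketch reproducing the numbers; note that the half-width overlap criterion is the commenter's own assumption, not an established perceptual threshold:

```python
def disc_refresh_hz(sectors: int, overlap: float, rev_per_s: float) -> float:
    """Refresh rate needed so each sector image overlaps its previous
    position by `overlap` of a sector width, at rev_per_s revolutions/s."""
    sector_deg = 360.0 / sectors
    max_step_deg = sector_deg * (1.0 - overlap)  # rotation allowed per frame
    frames_per_rev = 360.0 / max_step_deg        # 30 for 15 sectors, half overlap
    return rev_per_s * frames_per_rev

for name, lo, hi in [("alpha", 8, 12), ("beta", 30, 35), ("gamma", 40, 100)]:
    print(f"{name}: {disc_refresh_hz(15, 0.5, lo):.0f}-{disc_refresh_hz(15, 0.5, hi):.0f} Hz")
```

Raising the overlap requirement to 3/4 halves the allowed step per frame and therefore doubles every figure, as the comment notes.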
But this, just like the cursor, is a very extreme example - in most cases of everyday stuff, a much lower refresh rate would suffice to fool our eyes.
Let's take a 3m long car going by us at 30m/s - let's say we need it to overlap by 5/6 each frame - it takes 0.1s for the car to go past our eyes in its entirety, and so with the required overlap we need a frame time of 0.0166s, which translates into 60FPS. But if we were to replace that big car with a 30cm long drone going the same speed, with the same overlap requirement, we would need 10 times the framerate, or 600FPS, for it to appear continuous instead of breaking into a 'phantom array' (several discrete images of the drone visible at once at different points).
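The same math as the spinning disc applies here: divide the object's speed by the travel allowed per frame. A quick sketch with the comment's numbers (the 5/6 overlap requirement is, again, the commenter's assumption):

```python
def passing_object_hz(length_m: float, speed_m_s: float, overlap: float) -> float:
    """Refresh rate needed so successive images of a passing object
    overlap by `overlap` of its length."""
    max_step_m = length_m * (1.0 - overlap)  # allowed travel per frame
    return speed_m_s / max_step_m

print(round(passing_object_hz(3.0, 30.0, 5 / 6)))  # 3 m car at 30 m/s -> 60 FPS
print(round(passing_object_hz(0.3, 30.0, 5 / 6)))  # 30 cm drone -> 600 FPS
```

At the same speed and overlap requirement, the required frame rate scales inversely with the object's length, which is why the small drone needs 10× the car's frame rate.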
And we know that in real life things don't break into phantom arrays and everything appears continuous no matter the speed.
That is of course because The human eye HAS NO FRAME RATE, instead it sees in a more 'continuous' way.
So in conclusion, the human eye HAS NO FRAME RATE but if we wanted to fool our eyes into seeing things on screen the same way we do see reality then 600FPS would be enough for the vast majority of everyday stuff, but there are some edge cases which would require 3000Hz or possibly even more.
That's also the kind of math necessary to understand how to fool a camera shooting a discrete frame rate. ;)
To me, the eye question is far more interesting. Because it also provides insight into "hacks" like strobing impulse displays.
Agree about the 60fps/Hz comment. That seems to be the sweet spot for most people.
@Filmmaker IQ I love that hack, these super bright HDR displays can use their brightness budget to dip more heavily into Bloch's law, and give superb motion clarity per refresh. Though the phantom array effect is still an issue.
so minecraft on a good gaming computer is reality ok.
This was PERFECT... THANK YOU!
Your explanation at 10:40 kinda blew my mind. I had to rapidly play/pause to see both images side by side and you're totally right, the one frame of the candle compared to the three frame have different perceived brightness. Super cool!
A little tip, the < and > keys will move the video one frame at a time when paused. Much easier than trying to time the pause, haha!
@@Rilumai Thank you for pointing this out, Rilumai. I've often found myself struggling to isolate particular frames on YouTube, for example on explosions, crashes, magic tricks, etc. And now it is made much easier. Just pause the video near the moment, and then use the < and > keys to move frame by frame.
On VLC Media Player*, FWIW, the E key advances one frame (many other keyboard shortcuts for features of VLC not found elsewhere are listed in Preferences - Hotkeys; you can even change the hotkeys and their functions there).
* If you don't have VLC on your Mac, Windows, or Linux computer, give it a try. It is open-source and great!
@@GregConquest Hey, no problem! And I love VLC, I use it all the time.
Of course it will be brighter because you perceive it longer, so until you perceive a difference in brightness you are a human who can perceive faster fps, let's say 60, 120, 240 fps. So yeah, your eyes definitely can handle high framerate films, not just 24 fps.
@@serraxer if it's brighter because you perceive it longer that's more proof that it's closer to 10fps. That's how exposure works... The longer you let it expose the brighter it is. Hence 10fps.
But we don't actually see a frame rate.
An interesting nuance: if I show you an animation at slower rates (e.g. 3, 1, 0.5, etc fps), you can still see movement if you choose to. So, 10-12fps is not where we suddenly see continuous movement, it's where it becomes difficult to see individual frames. Subtle, but very meaningful distinction. As you say, we don't seem to see in quantized frames, so the 10fps seems more analogous to shutter speed.

In reality, the limitation is probably either neural network speed (how fast your brain can process the data before the next frame arrives) or memory (how much mental space you allocate per unit of time, forcing your brain to overwrite information from the previous frames with information from new frames).

You could probably test which one it is by finding your threshold for apparent continuous motion (the slowest frame rate that looks smooth) and then increasing the brightness. If the footage starts to stutter at the same frame rate, the bottleneck is probably NN speed (the extra light allowed your framebuffer to fill up faster, giving you more time to see independent frames). However, if you make the animation smaller or less detailed and it appears to stutter at the same frame rate (but at the same time does not stutter with more brightness), then the bottleneck is probably memory (mental "framebuffer" size).

And it's not completely either/or. Alleviating one bottleneck can cause the bottleneck to move to the other side. In nature, things tend to be balanced, so I'm guessing the bottleneck is pretty close on both sides. I'm just making this up now, so my test logic could be flawed.
Also, the algorithm might not be the main culprit. Frame rate might just be really interesting. YT regularly recommends your videos and live streams to me. The frame rate videos are always very compelling to click on. It's also because anyone who knows you knows there's drama and a story around you and frame rate. So we know there's going to be extra emotion and passion in the video.
Be extremely careful making analogies like that. The brain and visual system are not cameras or traditional computers. There isn't memory in the sense of an accurate record of stimuli.
As you go further down the road of this stuff you start to realize that ultimately visual perception is largely a function of what we THINK we saw. It's all subjective.
@@FilmmakerIQ I agree. By neural network I mean actual networks of neurons which computer NNs were named and modeled after.
Still I think biological neurons and computer neural networks aren't as similar as we think they are. Just because they share a common word doesn't mean they are the same thing.
@@FilmmakerIQ Especially in the visual system, where so much signal processing has happened before it even reaches the brain.
It's not an accident that the graphs for Bloch's law use a logarithmic expression for log T (time duration) and log L (luminance) - both factors have cumulative effects over time, and the logarithmic function describes perfectly the asymptotic limiting behaviour of the biological visual sensory system. In neural network science, both memory (data as signal or noise accumulation) and time (number of computational cycles and their duration, depending on the complexity of the apparatus) are of significant importance!
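Bloch's law, as discussed in the video, says that below a critical duration (roughly 100 ms) luminance and duration trade off: perceived response tracks luminance × time. A toy model illustrating the 1-frame vs 3-frame candle comparison; this is a deliberate simplification (the real cutoff is soft and varies between observers):

```python
CRITICAL_S = 0.1  # ~100 ms temporal integration window (the "10 fps" number)

def perceived_response(luminance: float, duration_s: float) -> float:
    """Toy Bloch's law: below the critical duration the visual system
    integrates light over time; beyond it, extra duration adds nothing."""
    return luminance * min(duration_s, CRITICAL_S)

frame = 1 / 24                                # one frame of 24 fps film
one = perceived_response(100.0, 1 * frame)    # candle flashed for 1 frame (~42 ms)
three = perceived_response(100.0, 3 * frame)  # flashed for 3 frames (~125 ms)
print(three > one)                            # the 3-frame flash looks brighter
```

Note that the 3-frame flash (~125 ms) already exceeds the window in this model, so its brightness gain saturates: past the critical duration, a longer flash looks longer, not brighter.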
The frame-rate man strikes again!
it's the sequel to the 24 series no one expected 😁
When you have a last name like “Hess” it’s in your blood to be a nazi about something…look at Rudolph Hess, worlds first grammar nazi…frame rate nazi????
30FPS??? RIGHT TO JAIL. RIGHT AWAY
The Anti-FrameRate man!
Wait until you tell them all what happens to colour when you turn the lights down. Oh boy, they are in for a shock and will learn just how much the brain "paints" the world.
Once again, you killed it with the depth and the clarity. I gave you a standing ovation.
I love this work that you did. I'm a cinematographer, so I know that if I try to shoot at 1/10 of a second, my video is going to have so much motion blur and it will be choppy because it's lower than the frame rate. But the reason my eyes don't see choppiness IRL is because they don't have a frame rate! But they DO see the motion blur. It's just so smooth because it doesn't have a frame rate that it needs to sync up with. It reminds us that we might be forcing the technology vs. biology comparison too much as a trivial attempt to understand our own biology, losing out on some other points of knowledge inherent in the biology. Maybe one day our screens will be more like the human eye? No more refresh rate or frame rate, just a persistent image flowing with the arrow of time instead of rigid increments of time. Similar to how a film projector is the inverse of a film camera, I'm thinking of a monitor that is the inverse of a human eye. Right now, monitors are more like the inverse of camera sensors, not eyeballs. Maybe it would need to mimic biology more? Some people equate rods and cones to pixels, but pixels are arranged in a fixed 2 dimensional grid. What if the screen was actually a 3 dimensional vat of organic "pixels" floating around in a malleable state rather than fixed? Maybe the next evolution of OLED screens. Sort of like the grain in video noise but they're actual pixels, dancing around extremely fast to generate a fluid, persistent image, and the pixels would move along 3 axes instead of 2. It would change the approach to pixel resolution, similar to the human eye. Take that a little further with machine learning and maybe... MAYBE... we can have something like "synthetic subjectivity", where each time we viewed something it was interpreted in a slightly different way by the display itself, sort of the inverse of the way the brain takes in and interprets image data subjectively, and you can toggle it on or off on the TV lol. 
But maybe that's taking it too far? It would at least be interesting and perhaps a new form of storytelling, like Rashomon but if Kurosawa was actively showing us a slightly new perspective every time we watched the film. We'd not just be viewing a pre-recorded thing, but in some sense the thoughts and memories of the machine, what it's "seeing" with its mind's eye and how its memory recall process evolves that story. I THINK this would be a more accurate comparison to human vision than current cameras and monitors.
An actual well-studied video, really good job!
Define well studied? This is nonsense
Morons will always think actual science is nonsense.
Fascinating!
One of the very best channels on the internet. Kudos!
3:45
Back in the days when cinematographers shot on film, there were cameras where the viewfinder looked through the shutter of the camera.
But one half of the shutter would expose the film, while the other half would be used for the viewfinder.
So, when a still camera took a picture with the flash turned on, and the camera operator saw the flash through the viewfinder, he was pretty sure that the flash had not been exposed to the film.
Another million views coming up.
PROPS for this, I enjoyed this video immensely!
Never seen this guy’s videos… but I can tell you he is fucking smart…
This might be one of my favorite videos in a LONG time. I will have to rewatch it a few times bc so many nuggets of information. Absolutely love it!!
Spreading 20 minutes of science and finishing by saying "you can do what you want (I don't care, I stick with my guns)" makes me think that this guy just wants to affirm more than understand.
If it's a matter of what's "pleasing" or not, it's more subjective, so why spend so much time on science?
Because that's the mature way to understand the world. 🤷
Science isn't there to affirm your subjective preferences. You sound like someone that comes up with your conclusion first and finds "science" to support your beliefs.
That ain't science
BRAVO! Love the video. Something I would like to see studied with Visual Perception.
Back in 2011 I got involved with a study in 3D cameras. I had a setup that was absolutely amazing, and shot some material. We developed ways of handling L / R pairs of videos and combining them into a single 3D L/R video. Sometimes when I was working with the equipment, I would accidentally reverse the Left and Right eye as displayed on the monitor. What you saw when you looked at the monitor was very subtle, but the effect would grow the longer you stared. You would initially notice nothing, just a normal 2D picture (not 3D). Slowly little areas of the video would start to take on a 3D-ness, but it looked inside out. After 10 - 15 seconds of watching the reversed video, larger and larger areas started to take on the inside-out 3D effect. It was very interesting. I then stopped the video, and looked at another part of the room (reality), and for about 10-15 seconds reality had this inside-out 3D look. It was a very strange experience to view the video in this way. I wonder if anybody else has done anything with reversed Left Eye / Right Eye perception?
That sounds absolutely trippy!
Please do a vid on MGM and WB's studio history!
This is such a fascinating and captivating video. I come from photography (specifically, originally astrophotography, where specialized CCD cameras are used) and I remember someone from that field telling me the eye has an "integration time" of 1/10 second. Not a frame rate, but rather in all that fluidity of the organic, biological system that is our visual apparatus, a "shutter speed" of 100 ms, so to say. Excellently confirmed by your experiment with the candle!
BTW, I do see flicker on a lot of LED lights where others don't. I always thought I'm crazy til I met others who could also discern it (particularly in peripheral vision - thanks for confirming this, too!). It's hard to find good dimmed LEDs where I don't see the flicker!
I think you've made it VERY clear multiple times throughout the video that the eye HAS FRAME RATE. Did I get that right ? 🤓👆
It's interesting that here recently 30fps videos on youtube such as this one always seem a little jittery compared to 60fps videos. It's kinda like how when I didn't look at a CRT for several years it was a bit hard to go back to looking at one due to the flicker.
I think the human eye's "frame rate" varies just like our ability to judge time durations of sounds. You can tell the difference between a 0.15 and 0.20 second sound but you can't tell the difference between a 2.15 and a 2.20 second long sound.
Watch the follow up video ;)
It's not that your eye actually sees the frame rate. You see the artifacts of the frame rate... the same artifacts that can be captured with a camera ;)
A while back I did an interesting experiment, I took an RGB (Red, Green, Blue) LED and fired each color one at a time to see at what point I'd see them as all lit at once. At 120Hz (40 times per sec per LED) I saw all the colors, but there was a bit of flicker. At 140Hz, (about 47 times per second per LED) it appeared as if the LEDs were lit all the time. I tried this with both 50% and 75% duty cycles (180 & 270 degree shutters) with the same results.
I'm not suggesting that the "frame rate of the eye is 40 to 47 FPS, but assume back in the days of black and white movies, IF someone had a camera that shot 1 frame of Red, 1 frame of Blue, 1 frame of Green, in a repeating pattern, that people would have seen the film in color. I'd hate to think of what the film costs would be to increase the frame rate to 120 - 140 FPS, but it would have worked. Come to think of it, an early CBS System for color TV had a frame rate of 144 FPS.
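The arithmetic in that experiment is just dividing the refresh slots evenly among the color channels. A tiny sketch; the 120/140 Hz figures are the commenter's own observations, not established thresholds:

```python
def per_channel_hz(total_hz: float, channels: int = 3) -> float:
    """Field-sequential color: each channel gets 1/channels of the slots."""
    return total_hz / channels

print(per_channel_hz(120))  # 40.0 per color - still a bit of visible flicker
print(per_channel_hz(140))  # ~46.7 per color - fused into steady color
```

By the same division, the CBS sequential color system's 144 fields per second works out to 48 per color, comfortably above the commenter's fusion point.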
Really interesting. Now to complicate that - you're essentially testing the critical flicker fusion of your separate color cone cells independently - smushing them all together to try to get white. So about 47Hz per color kinda falls in line with established science on flicker.
Your idea actually was one of the first color techniques, devised by William Friese-Green in the early 1900s. Instead of RGB, he used a sort of red-yellow - like Technicolor 2-strip - and doubled the frame rate of the time.
The idea of "Sequential Color" was also what CBS's color system proposed in the early 1950s (as you noted).
But the modern use of sequential color is in DLP projectors that also have a spinning color wheel. :)
I just want to get on my soap box and say, "PWM is the worst. Especially when it comes to headlights."
Car manufacturers use PWM for running lights and they are definitely below my flicker fusion threshold.
Eyes can detect different framerate, they don't have a frame rate.
Exactly. So can a camera that has a single frame rate.
This is fascinating. While I have long thought the human eye has no frame rate - it has no shutter and works on a biological system that interprets, predicts, and even fabricates inputs to create vision in a very different way than machines - I would now love to see a video on VR headsets and why motion below about 74 frames often makes people sick.
I think it has to do with inherent lag of head movement and visual perception. What I've read it's the same thing as motion sickness from sitting in cars.
7:12 I bet that the 10 to 12 FPS thing is the reason why 24 FPS became a thing because 2*12=24
Edit: another reason could be the fact that there are 24 hours in a day and 24 is divisible by 12. Some people say that 12 is a natural number to us.
No, 24 was sort of arbitrary
@@FilmmakerIQ I know it was random. I just thought that is why they randomly thought of 24 FPS.
When they picked 24fps they weren't even talking about fps. They picked a value of feet per minute which happens to equate to 24 fps. Back in those days they didn't talk about frames per second.
@@FilmmakerIQ Oh
Dude. The bottom line conclusion of this video was an art of a speech 👏
Beautiful.
Interesting. I have to explain persistence of vision to visitors to my lab when they are amazed they can see a laser pulse scatter from a screen that was only 10 nanoseconds (billionths of a second) long. That would be an effective frame rate of 100 million frames per second, based on that false argument.
No but poor videos with low frame rates hurt your eyes I believe that you should do a video on that man🎉
Moronic
@@FilmmakerIQ rotten banana 🍌
Very interesting. Honestly makes me wonder about some of my fundamental assumptions about why even stable low framerates look, not just feel, so awful in games but great in hand drawn animated film.
The only game I'm aware of that even tries a low framerate approach to animating is from the GDC talk about Guilty Gear Xrd's Art Style. While it works amazingly there, I haven't been impressed with the few 3D animes I've personally seen try this...
cool, so nice to have the term flicker fusion to describe my experience, and neat that good low-light vision (which I also have) is related. A high framerate crt driven at 60Hz, or half-wave rectified LED strings are noticeably flickery to me, even without “motion” (though certainly even more obvious as my eyes saccade)
You're the first person to use the word cicade! Kudos. That's a topic I didn't even get into.
@@FilmmakerIQ :D (Just looked up how it's spelled now; sorry I got it wrong before.) You already covered phantom array well, and had so much to cover in this video!
When I was shooting that spinning pattern test, I could essentially get the spinning pattern to freeze by flicking my eyes about. So really, saccades add another layer of complexity!
Is no one going to mention that the eye has no frame rate?
Ironically the people getting the most angry about this video insist that it does.
I'm First!!!!!!!! 10FPS is Pure Gold!!!!!!
Fantastic explanation! This should be required watching for anybody working with motion-based media.
Great video, long time ago I wondered how much resolution the human eye had and the answer was similar, it's too complex to put a single number, but never thought about the frame rate, definitely it's impossible to measure the eye with camera technology terms
Idk. Faster monitor frame rates feel better on my eyes. Movies and games at 144 feel better than 60, and even the almighty 24 for movies.
My experience is entirely opposite. 144hz looks terrible showing 24fps compared to a different 60hz monitor.
Did you try 144 and 60 on the same monitor?
I've been doing a lot of research on this area. I think I can tell you why.
But it doesn't have anything to do with the frame rate of the eye, in fact it helps to think of the eye as having 10fps as a conceptual starting point.
Asking "what is the frame rate of the human eye?" is simply the wrong question to ask, because the answer is NaN. The correct question to ask is "What is the largest frame rate the human eye is able to discern?"
That's also a wrong question to ask. What does discern even mean? What kind of motion are we talking about? 0fps will suffice for a still image...
@@FilmmakerIQ This is a pretty dishonest reply. The question is obviously implying the fps limit required to mix reality and simulation. Why would we be talking about still images in a discussion over fps? You can try and play VR at 10 fps. Good luck with that.
No kidding that the eye does not see in a "frame rate." But we can discern faster than 24 fps (let alone your ridiculous 10 fps proposition). When I watch old television shows, I can discern every frame, playing back in real-time at, for example, 30 fps. And it's not just your 60 hz monitor shot at 24 fps, either. I see the _sequence_ of images like a flip book, each distinctly preceding the next. I'm not only talking about during rapid motions or fast pans, either. I mean just normal human movements in relatively static shots.
I would estimate that I lose that ability near 35-40 fps. I can not see that way, for example, at 48 fps (which Peter Jackson's _The Hobbit_ was shot at) or in the occasional European 50 fps videos here on YouTube.
[EDIT: I just got to your 1 frame vs 3 frame candle example, and neither appears "brighter" to me, either. Bloch's law may be valid, but the threshold duration probably varies between individuals.
When I watch at 2x speed, indeed I see what Bloch's law predicts.]
I think you can just boil this argument down to, the human eye doesn't have a shutter.
Well yes, but neither did the old video cameras and they still had a frame rate.
Also get your troops away from my border Miriam
18:31 "a stimuli" lol you mean "a stimulus."
Unless it's more than one stimulus. It can identify as whatever it wants ;)
This is the best explanation I've seen on YouTube about the phenomenon of human vision!
Very interesting information. Thank you for taking the time to make this video.
human eyes have no framerate but they do have latency! and more latency to the brain with more beer in you bcz the ram of the brain is freezing :D
It's not just that people can see a difference between 60 and 144 Hz; it makes you better at FPS shooters because you can indeed see more accurately what is happening as you spin around. The eye is able to get more information from a 144Hz screen than a 60Hz one. 90Hz movies are a similar situation and I would love higher fps movies. I can never tell what is happening in very fast scenes at 24 Hz but I can at 90.
None of that is disputed or addressed here. It's that you do not see those high frame rates.
The eye does not get more information, it gets different information.
You might get a sense that the input lag is less with high frame rates but your eye is not discerning individual frames.
@@FilmmakerIQ i don't think it's "discerning individual frames". Obviously the eye doesn't work like that. More information over time as opposed to less information over time is being exploited by the eye and the processing that consumes its electrical signal. How that actually happens requires a better understanding of how photoreceptors work in parallel to produce an electric signal and deciphering that electrical signal in order to understand how it is originally encoded. I bet AI will really help with reducing this complexity into meaningful facts and statements about how eyes work
That's the wrong way to look at it... What happens when you take your eyes off the screen? Do you suddenly get even MORE information hitting your brain than a high refresh rate monitor?
Instead the better understanding, which is backed by psychology, is that it's all "apparent motion" after around 12 fps. It's just that different frame rates will have different artifacts and feels.
Sure there's more information on the screen but when it hits your retina, it all boils down to the same signal. In that sense the camera shooting the monitor experiment demonstrates that you don't need frame rate to discern different screen refresh rates.
Frame rate is still an artificial creation after all.
@@FilmmakerIQ imo you are looking at it the wrong way.
You can see 144 fps. This doesn't mean your eye or your visual system runs at 144Hz or more. It means your eye/brain have the ability to see the difference. It does however extract the information from higher frame rates.
"The eye does not get more information, it gets different information." So it can see it. There are observable and repeatable tests that prove it does (UFO test), also many many experiments conducted by different sources looking at the effect of higher frame rates in games supporting it.
I don't think your issue is that you can't understand the evidence but more you don't understand the word see. All that is required is for you to see and understand what you're looking at.
Show a monitor at 60 or 144 or 240 fps and you can see the difference and therefore see the fps.
If I say I can see a Tiger, your argument based on your logic would be: but you can only see half of it, so you don't.
Also I like how you complain about reductionism in your video but as soon as you discuss this you say "What happens when you take your eyes off the screen? Do you suddenly get even MORE information hitting your brain than a high refresh rate monitor?"
@@Rezeakorz No, I disagree because YOU HAVEN'T defined what it is "YOU SEE".
If I showed you a still image, would you be able to see the difference between 60 or 144 or 240fps? No of course not. So what is it that's different when judging those frame rates with motion - it's artifacts OF the "apparent motion" - the steps in-between. It's how that motion is constructed (remember - there's no motion in a screen - it's an illusion)
But those same artifacts are visible to a camera shooting an objectively low frame rate - or even a still image.
So a camera which does not record MORE or LESS information whether it's pointed out the window or a pointed at a computer screen, would still be able to tell the difference between frame rates. Why? The _total sum of information_ the camera records does not change, but what is recorded is DIFFERENT.
That final bit about what happens when you take your eyes off the screen is the nail in the coffin for your argument about "seeing more information" - do you get an information whiplash when you take your eyes off the screen and look at the real world? No of course not.
That's because your reductive conclusion about "more frames more information" is wrong. It's more frames, *DIFFERENT* information.
Awesome video, I do stop motion and I have always asked this question. Thanks for answering
6:00 but cursor phantom array isn't the only way that we can differentiate between 60 and 144. Just play a game at 60 and 144 and you'll see the difference, but in this case, the camera will not be able to catch it. How do you explain that?
Sure the camera will catch it. Same thing as the phantom array effect.
Here's an article with the camera catching it: blurbusters.com/the-stroboscopic-effect-of-finite-framerate-displays/
Haha ...
Oh man, people compare God with Computer. Do i need to say more?
This channel is pure awesomeness! Thank you for all the entertaining information!!
Thanks for your awesome content - but also, because of your expressed rejection of "higher than 24 fps": every time I see a multimillion-dollar masterpiece's realism fall apart from a wrecked pan, it makes me realise how much better an ice hockey match looks at 60 fps.
A bad artist blames his tools. If a pan is wrecked by the frame rate, it's the person doing the pan's fault not the frame rate.
I agree Hockey matches look better in 60 fps. But movies ABOUT Hockey, look better in 24.
@@FilmmakerIQ Thanks again for the impressive quality content - that's why I won't wreck your good craft with a rather absurd discussion. Nevertheless, stutter is on my top list of filmmaking things to avoid. Have a good day, maestro
@@FilmmakerIQ i don't think it looks better at 24 fps. At the very least you should acknowledge that this is your subjective opinion which I don't think you will.
Of course it's a subjective opinion. It's my subjective opinion along with the entire filmmaking industry vs your uneducated and inexperienced opinion.
Going to cinema after being exposed to 144hz refresh rate 16 hours a day is jarring. You can perceive individual frames of 24 fps movies and it's jarring.
I just checked the old video and I have it disliked. Well, that's embarrassing.
I’m not really understanding your argument. Is this a defense of 24fps? Are you making the assertion that the human eye/brain can’t see smoother motion with higher frame rates?
I hope not. Because that would be absurd and the evidence you present here is flimsy at best. For the record: I'm a supporter of 24 fps, but maybe for different reasons. 24 fps is very limited. And by limited I mean the information contained in a 24p video is limited versus 60Hz or 120Hz or whatever. We tend to think of information only as resolution - how detailed a given frame is. But frame rate is a sort of temporal resolution. The higher the frame rate, the more information for our brains to process.
This is why I like to play games at 100+Hz and watch movies at 24Hz. With a game I want as much information as possible and since the medium is interactive the extra fluidity, responsiveness and *information present with a high frame rate is paramount.
Meanwhile, with movies, I actually want the opposite. I want *less information. Film is not an interactive medium and, with the exception of Avatar, movies are created with actors, in costume, on set. Our human brains are very good at spotting fakes, and movies work by faking stuff. For the illusion of film to work you need that slight obfuscation of the lower frame rate to hide details our brains might otherwise recognize as fake.
This is why, in a roundabout way, the Hobbit and its sequels didn't work. People complained about the makeup and special effects being 'bad', when in reality the effects were just as good and in some cases much better than those used in LOTR. The issue was that the 48Hz HFR presentation allowed too much information to reach your brain…
Meanwhile Avatar 2 mostly works because, well, it’s one step away from a video game. As there are rarely real environments or real actors in the movie you’re basically just watching a CGI video game cut scene. It’s easier to suspend disbelief.
Did you comprehend ANYTHING from the video?
"Flimsy argument" is apparently the last 100 years of the scientific research into the subject... No the problem is you don't understand how vision works.
No, temporal resolution is not MORE information for brain to process. That's been a mistake we've been making for a long time because we've been oversimplifying the human visual system. If it were, real life would be an infinite amount of information and our brains would melt in everyday life.
Even taking your eyes off a screen would result in some sort of information whiplash.
Temporal resolution affects the character of motion. Once something is in motion, it's the same AMOUNT of information to the brain, just a different character.
@@FilmmakerIQ Yeah, with all due respect, I don’t think you understand this topic as well as you think you do. Again, no offense. There are lots of things that I don’t understand but as someone who has been around and involved in displays and display calibration for a long long time what you’re arguing here simply doesn’t add up- or, again, perhaps I’m misunderstanding your argument. Again, I’m a fan of 24 fps but a literal child (a LITERAL CHILD PLAYING FORTNITE) could prove this wrong in 30 seconds. There is a reason why high refresh gaming monitors are a thing. There is a reason why professional gamers play on 24” 1080p monitors with crappy color and crappy viewing angles just to get 300+ fps. And I can tell you as someone who reviews front projection systems that there are benefits to higher refresh rates beyond for gaming.
I recommend you head over to AVSFORUM, Blurbusters, and check out the Linus tech tips video on refresh rate as a starter.
@@sage11x He is not saying that a person can't tell the difference between frame rates, and he is not saying that higher frame rates don't appear smoother. What he is saying is that our eyes don't see in any frame rate whatsoever!
You literally did not understand the video whatsoever.
The majority of my research BEGAN at Blurbusters. I pored over the written articles (I'm guessing you don't even know what the BlurBusters Law is, you just played with UFO animations)
Blurbusters was where I was introduced to Bloch's law in the first place. BlurBusters's founder has even said on twitter:
"10fps animations stops being slideshows"
twitter.com/BlurBusters/status/1629740791089299457
After 10fps, it's all about removing artifacts of vision and discrete frame rates. He lists up to 10,000 fps to fully get rid of all of them. But something happens at 10fps that is significant.
Oh yeah... PC Gamer magazine had an article on it years ago talking to vision scientists:
"And studies have found that the answer is between 7 and 13 Hz. After that, our sensitivity to movement drops significantly. “When you want to do visual search, or multiple visual tracking or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, so you will average the other images that are in between into one image.”
www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/
That's 12 years ago. This shit's not new... you people have just forgotten it in favor of your fake models of human vision. You can't even be bothered to read your own sources that you think I should read.
Do I believe you actually experience the world at 10fps? Did you actually see the part where I said it was "silly"? (11:35) The brain and visual system are doing a lot more work, even if bundles of the retina are only operating at 10hz....
How many times in the video do I need to put up the graphic "The Human Eye Does Not See a Frame Rate" before you morons get an effing clue?
This is an absolutely ridiculous topic. Who the hell thinks cameras and monitors work like the human eye?
I guess this goes to show how dumb people are about our own understanding of our bodies. This is also the reason I laugh at people who are worried about killer robots/AI.
I receive quite literally thousands of comments saying the opposite ;)
@@FilmmakerIQ I wish I could say that surprised me.
Spend some time in online frame rate discussions.... Some gamers in particular get offended if you don't think they're some sort of superhuman
cameras were originally designed after how human eyes work. Light coming to a focal point, it has to be focused, enough light has to traverse through the optics bla bla bla
@@FilmmakerIQ I play with those people online, I already know they're ridiculous.
Can't agree more!
Some of the experiments are really fascinating.
64 frames per second and I can see the actors wigs and fake mustaches.
I enjoyed John’s video, but I was annoyed that he tried repeatedly to convince us that the human eye has no frame rate. Had he said it fewer than 99 times I might have believed him. “Me thinks you doth protest too much.”
Obviously the human eye has a frame rate. Unless you think the quantum world doesn’t exist. Just admit you don’t know the frame rate. It’s understandable, because the frame rate varies, just as our level of awareness varies.
Ugh... Like you think you're capable of perceiving the quantum world... This is why I had to keep saying it. Because idiots still think there's a frame rate.
13:33 so THAT'S WHY I notice a slight bit of flickering on my Nintendo switch OLED in the dark when it's in my peripheral vision
It's why stressed out people in office buildings get even more stressed out by the overhead fluorescents lights! ;)
good explanations. Now a video on what the f-stop range of the human eye is lol
That's actually pretty straightforward to calculate. I've seen it estimated at between f/2.8 and f/8
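For the curious, that estimate is just the f-number formula: f-number = focal length / aperture (pupil) diameter. Here's a rough sketch - the ~22 mm focal length and the 2.75-8 mm pupil range are assumed ballpark figures, not numbers from the video:

```python
# f-number = focal length / aperture (pupil) diameter.
# Assumed ballpark figures: eye focal length ~22 mm, pupil 2.75-8 mm.
EYE_FOCAL_LENGTH_MM = 22.0

def f_number(pupil_diameter_mm):
    return EYE_FOCAL_LENGTH_MM / pupil_diameter_mm

bright = f_number(2.75)  # constricted pupil in bright light -> f/8
dark = f_number(8.0)     # dilated pupil in the dark -> f/2.75
print(f"f/{bright:g} to f/{dark:g}")
```

Different textbook focal lengths give slightly different numbers, but the range lands around f/2.8 to f/8 either way.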
Oh nice!
I am a gamer, I play on a 240 hz monitor... I get what you're saying with 10 fps, but please explain this to me: on reaction time tests I can get a consistent 135ms click. So by the assumption that I see at 10hz, does that mean my muscles can react in 35ms? Because that sounds outrageous, and I know I see the box go green at about 20ms
Not all the cells in the eye are synced up at 10hz, even if an individual cell has a reaction time of 100ms.
Watch the follow up: ruclips.net/video/7zky-smR_ZY/видео.htmlsi=9SQo6Stn9Dmn5wk3
I'm watching this in 60 FPS 👍
hopefully the interpolation software doesn't mess up the visual experiments.
Very interesting 😄
Simple rules: is it shot on film or on digital cinema cameras - only 24 fps (no stupid Hobbit 48 fps experiment - that looked terrible in the theater). Is it shot digitally on a regular consumer camera - probably better at 30. Is it generated in real time - 30, 60 or more with display synchronization. And the potential lack of motion blur in digitally created and rendered material is a case for using more frames. Motion =/= frame rate :) And here is a great Gamery explanation of how to trick the eye/mind into making a 30 fps screen look like 120+ ruclips.net/video/IvqrlgKuowE/видео.html it is a nice trick.
You can shoot 24 on consumer cameras, but I do agree that it takes a little more skill. Motion blur is definitely useful.
That async thing is really really cool!
Could you theoretically have overlapping exposures, hence a shutter speed longer than the frame rate? I bet it would look really dreamy!
(Really it's one over the framerate to get what's normally the longest possible shutter speed, but we all know what you meant.)
Not possible when recording in real time. You could only do it in post.
I still have an idea to make a video talking about what it would look like if you summed up a few frames. It would actually look like an echo - multiple frames overlapping each other.
@@FilmmakerIQ I was thinking animated graphics, but if you can retain a smooth motion blur in each frame then it should be possible to add this effect in post from something shot at a 360 degree shutter angle.
Yes, 360 would be how I would do it. As I said it would be an echo. Because the motion blur would end after the next frame began. Essentially each frame would back up in time...
The effect in After Effects is actually called echo
Personally I do think certain shots in movies look terrible at 24fps, but only in specific cases. I never had issues watching movies, but there has been a situation or two where there's a panning shot of, for example, a forest with thousands of leaves in the image, and at 24fps it just becomes a messy blur. One such case (can't remember the movie) was so rough that after the panning shot I had to close my eyes and blink a few times. It's like it physically hurt my eyes because of all the noise.
If it's messy blur, it means it wasn't directed well or that the director intended it to be messy blur.
11:24 "Can't have a shutter speed longer than the frame rate" Not trying to be a pain, but there's no reason this isn't possible. With a bunch of light sensors all working asynchronously (as in the eye), it would be surprising if there WASN'T overlap between individual sensors detecting light and resetting. Even ignoring the eye, there's nothing mathematically impossible about that. Just overlap the windows of time the frames are drawn from.
As you say, applying the concept of frame rate just doesn't work for the eye. At best we can look at the consequences a frame rate has on film/video, and see whether we can find similar limitations with the eye, then pretend it has a sort of frame rate.
I said so in the video if you just watch it a few seconds more.
But no, you can't actually have a shutter speed longer than the frame rate. Not in a physical camera that has a frame rate.
Edit: What you're describing can't actually have a frame rate... which is what I keep saying about the eye! :D
I'm not sure why this is such a big thing. Of course the human eye doesn't have a fixed framerate. The point is not quantity vs difference. The point is that you qualitatively perceive movement better at a higher framerate than 10fps. What 'better' is, is not a matter of physically seeing more individual images as a whole, because we don't run on 1s and 0s. But because you are able to perceive the difference, most people would rather buy a 60fps monitor than a 10 or 30.
YES!!!
I wholeheartedly agree, but every time I tell people it's not "more", it's "different", I get yelled at and told I'm an idiot.
lmfao, literally telling people that video is actually just still images. I can't believe you need to go over this... The concept is in the term... Frame rate. Frame, e.g. still picture. Rate, e.g. rate of still pictures being displayed...
People don't really understand the implications of the fact that it is stills. It's a foundational issue that causes a lot of confusion, because all they know is discrete frame rate displays
you made 2 whole ass videos and continue arguing with people in the comments, which is admirable honestly. And yet after all of this the simple question remains unanswered: people hate low framerates in games. A fact that is independent of science. It is like smoking. Smoking kills people, yet so many people smoke. I don't care what the science is, I hate low framerates. What you really need to do is show the same scene in 24 fps and 60 fps as a comparison. Then people should prefer 24. Find or film a scene like that and you will win this discussion.
I've already talked about that. That's like comparing grape juice to wine. Of course you're going to like the grape juice if you compare them side by side.
And I'm not talking about gaming - quite literally said so in the video. Why are gamers so sensitive????
You really dove deep for the preparation of this video. Consider me impressed!
I do agree the human perception of movement doesn't have a frame rate.
As far as I remember, your initial argument was we shouldn't strive for more frame rate in movies, because 24 is all you need for movement and more than that is just extra effort when creating effects and filming a movie.
I still disagree with that statement. Movement is still more fluid at higher FPS, so there is room for improvement.
Also 3D movies at 24 FPS stutter when there are fast moving 3D effects. The stutter is gone on 48 FPS projections.
Now I can't say if the increased production cost makes sense just for more fluid movement. I can agree that probably that's the deciding factor.
By the way, in the experiment you do at 5:44 with the 2 monitors and the cursor shadow, you should distinguish between "signal refresh rate" and "LCD monitor matrix response time" and "screen backlight refresh rate".
Most monitor manufacturers market their "LCD matrix response time" by citing "Gray-to-Gray" value of 1 ms. But in reality their "Black-to-White-to-Black" response rate is much slower. That's the reason you see multiple cursors with the naked eye and on a 24 fps camera.
On top of that, most of the older LCD monitors used fluorescent backlights, which flicker with the power supply frequency at 50/60Hz.
Newer gaming monitors use white LED backlight, but control monitor brightness by strobing the LEDs. So you can have a 144 Hz monitor but configured at low brightness you'll still see something like 60 FPS.
Gamers take advantage of this backlight strobing to minimize those shadows caused by LCD/pixel response time, by synchronizing the signal refresh rate with the backlight strobing, so when the frame changes, the backlight is off for a millisecond or two. This lowers your effective monitor refresh rate, but gives moving objects cleaner edges and makes them easier to identify.
You might want to check the BlurBusters web site for a better explanation.
The same effect was also visible on the older CRT monitors, but the phosphor layer (hit by the electrons) would shine for a much shorter amount of time. I still miss my old CRT monitor.
For me personally, I stop distinguishing motion fluidity difference at around 80 FPS.
So if fast action scenes would run at higher FPS in the movies, it would be a great change, at least for me.
You have my argument for why 24 fps completely wrong here.
But in one of the next videos I'm going to talk about what I think is an underappreciated feature for watching movies and it definitely relates to backlight: BFI and flicker.
And FYI I discovered Bloch's Law from Blurbusters! :)
Camera pans are the most obvious culprits in 24 fps cinema. They honestly look horrible, no exceptions. Maybe some shots/scenes where the camera is stationary but the objects are moving look better at 24, but the moment the camera itself starts a rotation it looks horrible at 24. As a gamer, after getting a 165 Hz screen I honestly cannot even stand 60 anymore, so I have an idea where this guy is coming from. I ridiculed 30 fps for over a decade and advocated for 60. If you spend your whole life admiring masterpieces at 24 without experiencing higher framerates, then ignorance is bliss
monitors refresh the screen fast or all the phosphors would go dim
I love the guy he quoted as being an idiot upset me but I thought this guy was the same person :)
But I have 165hz monitor and a 60hz and the biology has nothing to do with your eye in the same way a camera measurement or quantization etc not how biology works and neural synapses etc
I'm going to need some vinaigrette to go with that word salad
12:14 Here's another way to think about it. The human eye sees no less than 10 Hertz per second or the equivalent of 10 Hertz per second.
That doesn't make any sense
@@FilmmakerIQ That's based on the 10-12 FPS theory
I think I might get where you're going but it's logically incorrect. The eye doesn't see in frame rate. So it's incorrect to say it sees at least 10fps... Because it doesn't see at any higher frame rate either...
@@FilmmakerIQ I'm talking about what the human eye sees on screens, not real life
You have to stop thinking that way. The eye doesn't know it's looking at a screen vs looking at real life.
good video, but your monitor demo there around five mins in doesn't properly illustrate anything for either side... the pixels on an LCD can't change instantaneously. It's true that the two monitors refresh at different rates, but the pixel response was likely similar on both at around ~50ms. Naturally, this would mean that those after-images would appear no matter the framerate.
This also would explain the three dots you saw in your demo shown shortly after.
No. This has nothing to do with afterimages. You aren't the first (or last to make that mistake).
Any exposure longer than the refresh rate will result in multiple exposures regardless of the G2G. Exposure is ADDITIVE. The refreshes on the monitor ADD together while the sensor is exposed to light.
Also, the camera here is static, so it would not be affected by G2G. You need a pursuit camera following a "moving object" on screen to see what you're talking about. That's what the camera on the slider in RTINGS videos is for.
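The additive-exposure point comes down to simple arithmetic (a sketch, assuming an idealized sample-and-hold display and a global-shutter camera): every refresh that fires while the shutter is open gets summed into the same photo.

```python
# Exposure is additive: the number of discrete cursor images a static
# camera records from a screen is just refresh rate x exposure time.
def cursor_images(refresh_hz, shutter_s):
    return refresh_hz * shutter_s

# With a 1/24 s exposure:
print(cursor_images(60, 1 / 24))   # ~2.5 cursors, as in the demo
print(cursor_images(144, 1 / 24))  # ~6 cursors
```

That matches the two-monitor demo: roughly 2.5 discrete cursors at 60Hz and 6 at 144Hz, with no G2G ghosting needed to explain it.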
So what I am hearing is if you have eyesight as good as a fighter pilots and the reflexes of an elite athlete, the upper limit necessary is 220 fps. Any more than that and you won't be able to see any details to react to whatsoever so there is no real advantage.
Nope. Completely wrong.
Watch the videos: this one and the how-to-simulate-reality one.
Would the speed of light perhaps be considered the limit of our frame rate, if we want to call it that for human vision?
The reason we see anything is really down to light, which has a speed limit.
Not speed of light, but Planck time could be considered the "frame rate" of reality. But that's so much higher than anything we can physically experience.
Your eye doesn't see.. your brain does..
I don't think it's that the eye can't detect fast flickering, I think it's just the brain simplifying the image
No, I think they determine the critical flicker fusion (especially of non-humans) by looking at the impulses from the optic nerve.
If you think about it from a chemical point of view, it makes sense. The eye cells need time to collect light, send the signal and then recharge. Flicker fusion definitely exists at the eye level
I appreciate you trying to convey that human vision doesn't fundamentally work in a framerate, and I actually learned a bit from some of the examples... but I feel like this didn't answer what, to me, is what most people are actually asking about when they wonder "What frame-rate does the human eye see?". What I THINK most people mean by that, and what I certainly am wondering, is really better phrased as "At what point is something in our vision for such a short duration of time that the average person won't consciously perceive it?". A practical test of this would be to have displays at varying frame-rates, and asking people if they can tell the difference between different pairs at increasing frame-rates until you hit a point where most people cannot perceive any difference in fluidity between the two displays.
I suspect that whatever that threshold is is going to change based on both who is looking and what is being shown on the display, but I'm curious if that's something people have tried yet. An alternative approach would be to have increasingly fast objects fly through somebody's field of vision until the person doesn't perceive anything having gone by. Again, though, that will likely vary based on the object's size and other factors (a train moving by at a million miles per hour is still gonna be easily seen if it's long enough that it stays inside your vision for an appreciable % of a second)
Of course, this is also really not measuring your eyes or your vision, it's measuring human perception, and at what point enough information is being fed to your brain's visuals systems to where you consciously pick up a blur across your vision or a slightly smoother animation as being a distinct difference or not.
At what point is something so short that we don't perceive it? That question is already addressed in the video with regards to Bloch's Law. There is no limit to the shortness so long as there is sufficient light to be registered by the eye. You can see a nuclear blast at a trillionth of a second and it would feel as long as a cell phone flash ( though it might be magnitudes brighter). Same concept as a flash and discrete frame rate, as long as there is a flash it will be captured.
Next, the question of what's the highest rate you can see which behaves like real life.... You cannot have any hope of understanding that question until you understand the fundamentals I laid out here. Concepts like Bloch's Law and flicker fusion are how you can assess that number.
I addressed the question of what is the frame rate of the eye. From this understanding you can approach the questions you think are pressing (questions that are not even really about the frame rate of the human eye)... but you cannot begin doing so until you have this grounded in your mind first.
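The Bloch's Law point made above can be sketched in a few lines (the ~100 ms critical duration is an approximate textbook figure, not from the video): below the critical duration the visual system integrates light, so intensity times duration - not duration alone - determines whether a flash registers.

```python
# Bloch's law: for flashes shorter than the critical duration, perceived
# strength ~ intensity x duration (the eye integrates light over time).
CRITICAL_DURATION_S = 0.1  # approximate integration window (assumed)

def effective_stimulus(intensity, duration_s):
    return intensity * min(duration_s, CRITICAL_DURATION_S)

# Half the duration at twice the intensity registers identically --
# which is why an extremely brief, extremely bright flash is visible.
print(effective_stimulus(1.0, 0.05) == effective_stimulus(2.0, 0.025))
```

This is why there's no lower limit on flash duration you can see, only a limit on intensity times duration.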
The brain registers a signal from the receptors at the back of our eyes in about 5ms, depending on the light intensity. You can flash a high intensity image followed by a low intensity image at a low frame rate and you won't see the second image. The eye also has light retention, meaning receptors do not switch off instantly. There is also the fact that triggering one receptor triggers a nearby one because of the magnitude of stimulation. In other words, we have ghosting just like monitors do.
Flashing two lights in succession is flicker fusion.
I wouldn't draw the ghosting comparison.
Be very careful drawing inferences. The eye does not actually work like a camera, even a lot of visual processing happens before it's sent to the brain
For gaming, there's more than just aesthetics involved, as it is an interactive medium and response time and detection of details can make a significant difference. It is more understandable things may get more subjective when it comes to non-interactive media though.
That's still an aesthetic ;)
If the controller latency is low enough, then framerate becomes redundant. Unless you have health issues relating to it.
@@BrianBaastrup Among other things, it's about reaction time.
You can see the 3:2 pulldown judder in the NTSC DVD version of the famous train station crane scene (Jill's arrival) in 'Once Upon a Time in the West' as the camera rises over the station house to reveal the town. The same scene in the PAL version has no judder, but the pitch of the soundtrack is too high (coz 25fps). Before Blu-ray was a thing I was annoyed enough with both these artifacts (for that particular film) to buy both NTSC and PAL versions of the DVD and remux the NTSC audio with the PAL video slowed down to 24 fps.
No you can't.
I never went through with the 3:2 pulldown video because my experiment demonstrated that you don't see frame by frame. (I tried a bunch of different ways to try to tease out the effect) Even with 24 frames your eye takes in about 3 frames at a time, and when I discovered Bloch's Law, there's no way you're discerning the pulldown.
So whatever was causing the judder of that one shot, it wasn't 3-2 pulldown.
@@FilmmakerIQ You can see that it's limping compared to regular 24 or 25 fps. You can also see when motion judders with mismatched frame rates, e.g. 24 fps footage is shown on 50 Hz refresh display. Does not mean you're seeing 50 or even 24 individual frames but you can still see that movements like pans and tilts aren't smooth but have a displeasing stick-slip quality to them.
Again, it's not 3:2 pull down. It could be an encoding error caused by mishandling the "dejudder" which would result in an accidental 12 frames a second. That looks like it's struggling. I recently had to deal with that in a video someone sent me.
But the reason why it's not 3:2 is because you think straight 24 looks good on your 60hz display. Also as I demonstrated in this video, you don't see individual frames but collections of frames.
24 on a 25 hz display is different because the pulldown for that is a hiccup every second; that's much more visible.
@@FilmmakerIQ Okay, thanks for the clarification. Sounds like an instructive experiment to do for myself: convert 24fps to proper 3:2-pulled-down 60 fps and see how it fares.
@@FilmmakerIQ Did the experiment with the 23.97 fps bluray version of the same scene and you're absolutely correct: can't see judder in the 59.96 fps 3:2 pulldown conversion. In fact, synced up side by side on a 119.92 fps refresh display sometimes I thought the original was smoother, sometimes the converted version was smoother, so basically I can't tell.
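For anyone who wants to repeat that experiment, the 3:2 pulldown cadence itself is simple to sketch (a toy model, with letters standing in for film frames):

```python
import itertools

# 3:2 pulldown: alternately hold each 24 fps film frame for 3 then 2
# display refreshes, so every 4 film frames fill 10 slots on a 60 Hz
# display (24 fps x 10/4 = 60 fps).
def pulldown_32(frames):
    out = []
    for frame, repeats in zip(frames, itertools.cycle([3, 2])):
        out.extend([frame] * repeats)
    return out

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The side-by-side test described above amounts to comparing this repeated-frame sequence against the straight 24 fps original on a display whose refresh rate fits both.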
Nice comparison of Biology vs Technology. There is also a bit of history with film settling on 24fps as the cheapest framerate that delivered motion, and a 180 degree shutter for pleasing motion blur. Then we got TVs, whose sync rates were driven by the cycles of the electrical grid. We have been conditioned to prefer this look all our lives. These days however, technology has moved on, and all of our displays (including TVs) run at least 59.94/60hz (NTSC) or 50hz (PAL), requiring some form of Pull Down Pattern (repeating frames) to have these lower frame rate videos play at the correct speed on these higher frame rate displays. This now gives the opportunity to effectively double our temporal resolution while keeping the same motion blur we are so used to. I made a short clip of the Wagon Wheel effect for both 29.94 @ 180 and 59.94 @ 360 for your viewing pleasure! - ruclips.net/video/9k6nCGEHeWQ/видео.html
No 24 wasn't the cheapest frame rate. It was actually higher than many of the silent films around. But not quite as high as what some exhibitors were showing. 24 was actually kind of the middle of what was acceptable.
And no, you don't double the temporal resolution and keep the same look. Yes 24 is partly conditioning, but also partly because that look is unique. You can't up the frame rate and think you're going to achieve the same effect.
@@FilmmakerIQ Thanks for the history back to pre-talkies! I did not say you get the same look (but the same motion blur per frame). It looks the same when there is little movement, but importantly less juddery with faster motion, as each 2nd frame is unique and not just a repeat. The result is there are no "gaps" in the motion that you get with half the temporal resolution at 180 degrees. So I agree that it can absolutely look different; whether you prefer it or not is a different argument... (I do).
A lot of people approach me with regards to shutter speed and blur. I hope one thing people take away from this video is that we don't see motion picture frame by frame. We see it as a collection of frames that induce the psychological effect of apparent motion. The more frames, a different kind of apparent motion is generated. This is why 24fps at 1/48 and 48fps at 1/48th look entirely different even when each frame has the exact same motion blur. So focusing on motion blur per frame is the wrong way to look at it.
Now regarding gaps in the motion blur... This should be part of my next video. With flicker (which is traditionally part of motion picture) gaps aren't as noticeable. That's going to take some work explaining... But that's why in the industry 24 is often described as buttery or creamy smooth.
@@FilmmakerIQ Looking forward to it & love the discussion! Keep in mind that (and I'll use PAL frame rates for this to keep the maths nice and round), on a 50hz monitor you are always watching 50fps. If the content is 25p you're displaying each frame twice and each is 1/50th of a second (nobody these days is watching 25p on a 25hz monitor). Given it is a 50hz monitor, the other choice is to deliver 50fps still at 1/50th of a second and instead of repeating frames, you get a slightly different one. Yes it looks different. Also the buttery smooth look of Film has a lot to do with the filmmaking technique to limit the impact of these lower frame rates (like well-established pan rates to minimize juddering etc). These professionals have perfected that look for 100 years.
Wait till you see my demonstration on flicker ;)
It's a little known and underappreciated aspect of this topic but utilized also in high frame rate gaming and VR!
Also, the eye darts around and builds up a picture, and a lot of the detail gets filled in by our brain, no?
The brain is the primary driver of what you see. That's even why you can make a claim of 10fps and still not experience 10 fps. It's the brain that creates your visual existence.
Another example of this is the retinal blind spot. You'd never notice that you have a hole in your vision, because your brain doesn't let you notice.
Mind blowing video.
Fascinating topic.
All the cartoons and anime you love, 12fps.
Well not all. Disney is known to animate on 1s :)
I'm surprised by this video. I am a motion portrayal enthusiast, and I was expecting you to get everything wrong like 99% of videos or articles on the net about the subject.
Since you seem to understand the phantom array effect, it would be really, really nice if you could tell your audience the correct answer to the question most people are actually asking.
Which is what frame rate/refresh rate is necessary on a screen for it to look like real life motion. In other words to brute force its discrete nature away.
I think you probably know the answer (if you don't, it's simple: just populate every pixel along the motion path with one frame/refresh).
So the way to explain it should mention how it depends on the speed of the motion, resolution, ...
When you calculate it, you can see that even with the current style of screens used for computers and games, you already need four- to five-digit frame rates/refresh rates to achieve it.
It would help so much if you could explain that, because the vast majority of people incorrectly think that 120, 240, 360, 500 or even 1000Hz is enough. The correct answer is that there are benefits way beyond any of these arbitrary numbers.
Thank you in advance.
Ah, and for the record, I hate the aesthetic of 24fps and I find it incredibly uncomfortable to watch. I hope it dies ASAP. But even if you seem to disagree, I'm glad that at least you were correct in your explanations. This is so refreshing to see. I have to praise you for that. Great job!
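The "populate every pixel of the motion path" rule described above can be sketched with a back-of-the-envelope calculation. The cursor speed below is an assumed value for illustration; the idea is simply that, to avoid a phantom array, the object should advance at most one pixel per refresh:

```python
def min_refresh_hz(speed_px_per_s: float) -> float:
    """Lowest refresh rate at which an object moving at the given speed
    advances no more than one pixel between refreshes."""
    return speed_px_per_s / 1.0  # one pixel of travel per refresh

# A cursor flicked across a 3840 px wide screen in a quarter of a second
# (an assumed, fairly ordinary mouse motion):
print(min_refresh_hz(3840 / 0.25))  # -> 15360.0, i.e. a five-digit refresh rate
```

Faster or smaller objects push the number higher still, which is why the answer depends on speed and resolution rather than landing on any single magic Hz.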
Yep - upwards of 10,000fps and maybe higher.
You'd arrive at the same number if you were asking how to get a CAMERA pointed at a screen to get rid of the stroboscopic effect.
But that wasn't the question of the video - My exploration gets into even real life - my focus isn't narrowly interested in just screen tech.
A vast number of people don't even understand the parts I'm explaining in this video. In fact they get pretty angry about it.
@@FilmmakerIQ I get it, you're looking at it from a different perspective.
I know what you mean about people not understanding and getting angry.
That's because unfortunately the vast majority of the information available on the internet about motion portrayal is just plain wrong.
People like you who have a correct understanding of it and have a platform to educate people are extremely rare.
But every bit helps and I thank you for this video.
Second time I'm pleasantly surprised already.
I clicked your video expecting to roll my eyes at everything you said, but you really did your homework and I couldn't fault a single thing.
And now on top of that, you read comments and respond to feedback.
We need more people like you in this space.
Have a good one, man!
When I shoot in 4:3 and I want it to "look like-old school TV". it's 30FPS. Any "wider" and it's 24FPS. I will let the actual pros mess with others, LOL.
Interesting video. I agree that the human eye does not have a frame rate like what we would describe with a video camera, but, I do tend to think that like what you covered, there are some interesting numbers that could be conflated with a frame rate if somebody wanted to argue that.
My view on it is that there are some frame rates on display devices that perceptually are more pleasing to the human eye. Sports on TV at 60 fps is one example. It just looks better at 60fps, and much like gaming, would probably look better with more fps. Gaming... more is better. I don't tend to think that the eye has a FPS per se, but rather has a rate at which it can detect movement smoothly, the lower end being down in the 10-15 FPS range, and the upper range a lot higher than that... How much higher? I think that depends, but to your point, the eye is not a video camera, and while it has a lot of similarities to a video camera, it does not work like one, and therefore does not have a "FPS".
Really high IQ stuff 😊
In essence: are we framed? 😉
The frame rate people can perceive at is incredibly variable. About 12 frames per second is required for the phenomenon of something being animated; 60fps is perceptible for the average person at resting heart rate, and 120 is perceptible for the average person in periods of high adrenaline or focus. Beyond that is basically a field of visual athletics, down to gifted people with unmatched dexterity and visual acuity, people who have "trained their eye" in very specific ways. Being able to glean any useful information at these frame rates is incredibly difficult, likely only possible for experienced pilots and pro FPS gamers. If more people could perceive at these frame rates, I guarantee DLSS 3.0 would look way more shit to way more people, because you'd actually be able to notice the gross artefacts.
🙄
Watch the video. Because you are speaking nonsense.
Stop glorifying gamers. An accountant could see things in spreadsheet that the average human couldn't. There's nothing superhuman about experience and familiarity.
@@FilmmakerIQ You're very blunt. I like you. Subscribed.
not so much really training the eye, but just getting used to higher fps makes lower fps look worse, 30fps never bothered me and used to look smooth, but after playing at 120fps for just a couple weeks going back to 30fps looked like a slideshow
"Looks like a slideshow" is such an NPC phrase. You guys don't know what a slide show is.
After watching a million 24fps movies and getting used to that look, any high frame rate video trying to be cinematic just looks like garbage. But I also get frustrated when my editing software isn't giving me a snappy 60fps.
Different mediums different expectations.
@@FilmmakerIQ no need to be a dick in your responses. You are just used to 24fps; you are literally the embodiment of NPC responses because you are set in your ways and refuse to step outside your singular mindset. I'm talking mainly about the difference between 30fps and 120fps gaming. TV and cinema aren't quite as bad because they use motion blur, but if you were exposed to only high frame rates for a few weeks you would have the exact same experience watching 30fps content, especially on OLEDs that don't have a slow grey-to-grey. On LCD screens it's not as noticeable because of panel-induced motion blur. It's really sad that gatekeepers like you are holding back the future of superior visuals with nostalgia addiction. 24fps doesn't look better; it actually looks like shit when you have something better to compare it to, but you'll never understand what better is until you actually experience it.
Yes, the dots on the rotating drum! I don't know if you can, but after I watch it enough I can actually make the rotation, at will, go left or right. It must've been the mushrooms in my younger days???
To me, the only upside of high refresh rate monitors is when something on-screen moves very rapidly. You would see the path that the rapidly moving object takes, more clearly.
The eye and brain of gamers need more than 24fps to see something clearly. I love my TV's ability to interpolate the frame rate from 24fps to 120fps. Action scenes look much clearer to me and are easier to follow that way.
Are you maybe planning to make a video about the connection between the frame rates of VR goggles and the human eye? I actually looked up this video trying to understand why people feel sick when VR goggles have lower fps. Can you maybe explain that topic?
I'm planning on doing one talking about the frame rate needed to completely get rid of any artifacts of a discrete frame rate (i.e. simulate actual reality).
@@FilmmakerIQ Can't wait for the video then! I'm writing my bachelor's thesis about making most immersive VR space taking into account possibilities of current technologies. That would be very helpful! Thank you for that video and I'm waiting for the next ones too! 😀
Not gonna watch this whole thing but the eye doesn't operate in fps.
You could have watched a few seconds and heard me explain that.
I think then the question is being done incorrectly. I'll have a go at rephrasing it...
"How many fps would it take on a digital screen to make something feel as real and smooth as looking out of a car/train/airplane window? In other words, indistinguishable to the observer and able to fool them."
That's the wrong question. Because the answer to that would also apply to a camera with a fixed frame rate.
I think the answer to that would still depend on a lot of factors.
We don't see in frame rate, as noted at the start. The way we do see is more akin to a 10Hz monitor that doesn't have any way to sync frames. Without any way to sync, things like flicker and judder will be detectable with high motion unless the eye is being bombarded with an excess of frames during the high-motion sequence.
So I hear you saying that the human eye definitively has a frame rate of 10 Hz? ;)
Yes most definitely.
The spinning drums wouldn't flip randomly for me even when staring at the dot, but if I imagined they were spinning in a specific direction they suddenly were. I could get the spinning drums to flip simultaneously or desync at will.
I never thought the eye had a definitive frame rate; the idea always seemed rather silly to me. It takes a certain amount of exposure/time to form a fully complete picture, and maybe that is around 10Hz. The eye can still see things much faster than that, at the cost of some detail.