What was your first memorable bluray? Do you plan on upgrading to 4k? Also... anyone know why the blood field you're constantly looking through is blue? The answer will be revealed in my color video! And for the worried... I was putting sunbutter in Wheatley's mouth. It's totally safe but... maybe a little annoying. :D
I would say Bram Stoker's Dracula was my first Blu-ray viewing; it looked far superior to any DVD version, and I thought the DVD looked good. This 4K Blu-ray thing is a cash grab, and nobody I have talked to about it gives a flying fuck about 4K UHD; they're the next CEDs or Betas. They won't sell very well and will die out, that's what I predict... not enough interest in them so far. If they do get embraced, I'll be very surprised.
Humans don’t “see” in a quantifiable resolution. At least not one that’s comparable to a digital output resolution for a display. The main appeal behind 4K is not just the resolution, but the size at which you can display that resolution and still maintain an acceptable degree of definition.
We were filming in 4K raw in film school in 2017. However, most of the time you compress down to 1080 for distribution. The main advantage of filming in 4K before compressing to 1080 is that you can make more corrections in post before noticeably affecting the quality of the film.
This is the same reason pro cameras often shoot in 15 megapixel or higher (some as high as 50 megapixel). Besides allowing you to produce huge wall-size posters that stand up to detailed close examination, it gives you tons of room to crop in post, and to do edits like object removal more cleanly.
@@chiaracoetzee The benefits in the editing room are nice but the biggest benefit comes when printing on 70mm film, which can have a horizontal resolution equivalent of up to 12k (estimated; depending on pulldown direction, perforations per frame, stock quality, etc)
@@chiaracoetzee Same reason pro audio is recorded at 24 or 32 bits and up to 192 kHz, even though you can't hear better than the standard 16-bit/48 kHz used for distribution. It gives us headroom for processing and for avoiding aliasing.
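A back-of-the-envelope sketch of that headroom argument in Python; the 6.02 dB-per-bit figure is the standard ideal-quantizer approximation, and the bit depth / sample rate pairs are just illustrative assumptions, not a claim about any particular studio setup:

    # Rough sketch: dynamic range gained from extra bit depth (~6.02 dB per bit)
    # and the Nyquist bandwidth implied by common sample rates. Illustrative only.

    def dynamic_range_db(bits: int) -> float:
        # Standard approximation for an ideal quantizer: 6.02*N + 1.76 dB
        return 6.02 * bits + 1.76

    def nyquist_bandwidth_khz(sample_rate_khz: float) -> float:
        # Highest representable frequency is half the sample rate
        return sample_rate_khz / 2

    for bits, rate in [(16, 48), (24, 96), (24, 192)]:
        print(f"{bits}-bit / {rate} kHz: "
              f"~{dynamic_range_db(bits):.0f} dB dynamic range, "
              f"{nyquist_bandwidth_khz(rate):.0f} kHz bandwidth")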
Not coke, but they typically have Adderall or some other amphetamine with them to pop if they need it on a long mission. Also, the Nazi armed forces, not known for doing things lightly, were rather liberal users of methamphetamine.
@@ianthompson2802 The Republic F-105 Thunderchief, despite its fairly ugly combat record (more than 300 of the 800 built were destroyed in crashes), managed a combined total of 27.5 kills against enemy MiG-17s. 3 were from the use of AIM-9 Sidewinders, and 24.5 were from its internal M61 Vulcan 20mm rotary cannon (the .5 came from a MiG kill shared between an F-105 and an F-4E Phantom II).
1440p kind of sucks, honestly. As its own standalone resolution it's great, it just doesn't scale well at all with 1080. That's the nice thing about 4K: it scales perfectly with 1080p and similar resolutions.
Hey man, I work in the visual effects industry. I want to correct something you said. Most of the work in television and film is shot in 4K or higher these days; pretty much every professional camera does it. DPs want the larger number of pixels to work with in post, as you can shoot wider and crop, or add in visual effects and maintain higher fidelity than if you did the same with 1080. And yeah, I've worked with 8K footage as well. It is ironic, because we used to work in lower formats like SD in post production and upres to HD at the end. Now it is the opposite: computers are fast enough to edit in high resolutions, cameras shoot in higher resolutions, and then the show is brought down to 1080 for the marketplace.
Lol, that's awesome. I upgraded my gaming equipment to record 4k for the same reason. Not because I think that 4k is better, but because sometimes I can make a better video for my youtube channel if I can crop the screen and not lose any quality. It lets me functionally zoom in to get angles that would otherwise be impossible.
Yeah, and I always look for lower quality so I don't get sick; I even had to find a used TV to watch stuff. I don't know what I'll do when regular HD isn't available anywhere. Trying to play games on an Xbox One X is terrible unless I plug it into an old standard HD TV. I keep it set to 720 and do great, don't feel like I'm missing anything, and that's with or without my glasses. 4K is just too "fake looking".
Rotoscoping low-res, low-shutter footage is an absolute bitch to work with. We can add that "motion blur" from a matched shutter speed afterwards; it's just so much nicer to work with footage that was shot at a high shutter like 1/144. MMMMMM, clear crispness. I can dirty it up for the people who are obsessed with that "wave your hand in front of your face" blur.
* barely. Text? Sure. Basically, applications when you're sitting right in front of the monitor. I am sitting in front of a 4K monitor as I type. Have to turn my head to see the other side of it. It's more like having two 1080P screens side by side with no gap.
It gets more important the bigger screens get. There's a reason 480p looks clear on a small monitor. There will probably be a point where 1080p on big monitors starts to look unclear. Think about the future.
VR is an even bigger fad than 4k monitors. I don't even know anyone who actually games in VR. And I'm saying that as someone who used to develop VR games. The company that I used to work for was one of the early adopters; we received Oculus Rift prototypes before VR gaming was a thing. It was cool but I immediately realized how much of a fad this was. People these days play games where you can prove your expertise and skills. Games that tend to be extremely competitive are the ones that are the most interesting to watch and play. Shooters and MOBAs are very good examples of this phenomenon. VR gaming is basically a "prosthetics" simulator. You play in a virtual environment and the only way to interact with it is by means of prosthetic limbs. It's awkward at best. Sorry for rambling, I just had to get that off my chest. Have a nice day.
It's sad, really... so many people are gonna call that movie a classic, when really it (and the whole series) was just visual garbage that borrowed from everything, with nothing really original in it.
@@saucyx4 And therein lies the problem with the younger generation. Calling that movie (The Matrix) a classic is just shameful. The concept is ripped right from Terminator, and the action sequences are ripped right from the stills of anime. The character development is really nonexistent for any character outside of Neo himself, and his development is a little thin and literally happens entirely in the first movie. Look, I'm not saying it's a bad movie. It was an enjoyable movie. But it lacked originality and depth. It just doesn't deserve to be viewed in the same vein as movies like E.T. or Terminator or Jaws.
Came back to watch this 5 years later where 4K is the norm and totally streamable, 1080p is what your grandparents use, and 8K is the new "Nobody can tell a difference!" Now you can pick up a cheap 512GB thumb drive for $10 and with modern compression you can fit a half dozen 4K videos on it.
Filming in 4K can be useful if you are planning to downsample your final output to 1080p. This offers video editors the luxury of 2x digital zoom with no loss of resolution. I used to film in 1080p and render in 720p all the time; that 1.5x digital zoom was critical for making crisp-looking virtual camera movements.
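That "free zoom" is just the ratio of capture resolution to delivery resolution; here is the arithmetic as a small sketch (the resolutions are the ones mentioned in the comment above, nothing more):

    # Illustrative: how far you can crop/zoom when capturing above delivery
    # resolution without dropping below the delivery pixel count.

    def max_lossless_zoom(capture_w, capture_h, deliver_w, deliver_h):
        # Limited by whichever axis runs out of pixels first
        return min(capture_w / deliver_w, capture_h / deliver_h)

    print(max_lossless_zoom(3840, 2160, 1920, 1080))  # 4K -> 1080p: 2.0x
    print(max_lossless_zoom(1920, 1080, 1280, 720))   # 1080p -> 720p: 1.5x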
I was looking at your comment but it was 1080 60 megahertz 5K and which I could not be read due to the resolution in pixels that are deviated from the 6000 BTUs that I'm used to
Sosa Kun, why do so many people make this comment on YouTube? Is it a special club you are in? Anyway, the more you click on suggestions, the worse it will get.
Ignoring Omni Torrent... ...Ideally you don't want to see the _dead_ pixels. Mild pixelisation is usually tolerable, but having visible dead pixels can really ruin the enjoyment of a screen.
@unrepeatable raddish 8K currently is pointless unless you have one of those really, really huge TVs (so not talking 55" here)... but if those TVs become the norm then 8K will be great.
The decent-enough difference shows up mainly in the super-colorful videos made specifically for 4K marketing. Once it's part of the home with regular input sources (streaming, Blu-ray, etc.), it becomes a very expensive object sitting side by side with a good old 1080p set. The big differences I can sense are contrast and light, typical for any new TV: they do get brighter and sharper, with higher contrast, but that is not 4K-specific. The regular consumer should not spend their savings on it.
35mm celluloid film (which is what was usually used to film movies pre-2005) can be digitised at up to 16K, just so you know. The true definition king is a 35mm projector at 25-70 frames per second. Plus then you have Panavision ultra-widescreen, which is 70mm (used for Ben-Hur, Lawrence of Arabia and The Hateful Eight).
I had a teacher who drove 4.5 hours in a blizzard to see a one night only showing of Lawrence of Arabia in full 70mm. His take was that even though you can get access to that now easily, it was worth it to see it before his eyesight went bad 😂😂😂
It’s worth mentioning the image size of 70mm movie film is roughly the same as 35mm stills film, because of the direction change. That’s also where 4:3 came from, as stills were 3:2 but if you split them in two and rotate them you get 4:3. And also that film grain was larger when 70mm motion pictures were more popular, a more recent 35mm motion picture has about as much resolution as a 70mm film from the heyday of 70mm. Of course all 35mm film from before that is much grainier, and indeed tops out between 4 and 8k instead of 16k.
I watched this and then checked his channel looking for the "I was wrong" video listing how readily available 4K content now is on Netflix, Apple TV, etc.; how UHD Blu-rays exist and are growing in popularity; and how HDR and other features make 4K TVs far superior.
Or how increasingly higher resolutions in video games increase the rendering resolution, show more detail in-game, and let you see farther. But keep in mind, this video was made a year ago.
Here's the deal, though: Your fovea, that all-important center of the eye that can see the finest detail, could be looking anywhere on the screen depending on where the viewer's attention currently is. So you have to design the whole screen uniformly and assume the viewer can be looking at any position at any time. It doesn't matter so much whether the TV has overall more pixels than your retina or not: It matters whether or not the TV will look crisp to your fovea. No, you do not design a TV based on the idea that the fovea, the most important part of our eyes and the part that will definitely notice a lack of detail, is irrelevant. That would be stupid, basically like most of this video.
Would be interesting to see a screen with eye tracking used to bring into full focus only where you are looking and save bandwidth with the rest of the screen.
Adam Savage I think some games are doing that to boost performance, but I haven't heard of TVs doing that. Again, TVs don't know where you are looking, and even in cases where a TV show or game is intentionally doing some blurring, the TV itself is still gonna have the same resolution everywhere. That said - you are likely correct as far as games go. If a game can blur part of the image where they don't expect (or don't want) players to look, they can reduce a bunch of detail in the blurred areas, which would improve the performance of the game.
Adam Savage Well yeah what you describe with the textures is part of what's done as well. There's actually a lot of tricks behind the scenes of any game. As far as variable frame rate, there's FreeSync (AMD), G-Sync (nVidia), and HDMI 2.1 (hopefully everybody, but adoption is slow).
I honestly hate motion blur in video games and I always turn it off if I can. I find that the "faked motion blur" actually makes me feel ill. I find it interesting that the Soap Opera Effect can make people feel ill.
It’s fascinating how differently perception works for different people! I hate the Soap Opera Effect, I can stand projectors with colour wheels, and I can’t bear looking at backlit LCD screens without protective glasses for the blue light!
This is so misleading. The number of pixels in an image has nothing to do with the capability of the human eye to see one particular pixel. It has to do with the size of the display on which you will see that image. Suppose you take a picture of a person with a camera with a resolution of, say, 640x480 pixels. If you were to see that image on a 43" TV it would look very pixelated, whereas on a 15" monitor it would seem fine. With the advance of technology and the increasing size of displays, it is necessary to capture a picture/movie with as many pixels as possible so it looks fine on a big screen. And yes, as of 2018, 8K is coming to people's homes. You made a prediction that 4K would take 5 to 10 years to be used by most people, and you're way off that reality.
Yeah... the funny thing is, where I used to work we were actually showcasing our prototype 20K displays, and that was almost 3 years ago (20K displays for showpieces in large malls/venues). It was kind of just showing our customers what we were capable of. The fun part was playing Half-Life 2 on it; the bad part was getting yelled at by some higher-ups because we were playing a video game on it. The reason was that we didn't have access to the 20K footage needed to put the regular demo up. We were the guys who worked on site and tested all the racks before they got shipped out. We were told to get it set up because of a drop-in by some regular big customers they wanted to impress, and after a few hours of not hearing back from the people who had been sent to on-site locations, we took it into our own hands to run the Half-Life 2: Lost Coast demo (we set it up in house to showcase that, with our software and expertise, you could run essentially any graphics on the fly, with sound matching what is happening on screen in real time; we did that for 4K-8K screens and upscaled it to 20K). It was odd that we impressed the customers after explaining the demo and all the applications they could use our system for, but got in trouble with upper management because we didn't use the regular demos (our demo was done by our own area for the customer tours that regularly walked through; it was fun because the people visiting our area were normally the technician guys who would be working with our product at their venues, and they'd geek out over what they could potentially do). Also, sad to say, for true 8K you are looking at a minimum of 30k worth of equipment just to deliver it without any issues in real time, even now. With the next wave of tech in the next 2 years it may be down to 10k, unless prices go up for parts (retail, those 30k parts would be worth 100k+ due to the needed proprietary software and expertise, as well as economies of scale). Granted, it is possible to make cheaper 8K, but it would come at some cost in performance: buffering, possible sound desyncing and whatnot. Well, theoretically I guess an Epyc chip might be able to pull it off at around 10k cost now... hmmm
8K requires ridiculous inputs and no interface will scale properly with it. I saw some Dell 8k monitor demos, you'd need a microscope to find the start button.
You forgot to add one thing to this: viewing distance. The closer to the screen you are, the better a higher resolution is. It's like, everything looks good on my 50" 1080p TV if I'm lying in bed playing a game. If I move up and sit on the edge of my bed, I can see the blurring or pixelated look of games. When I'm using my PC monitor (21:9 1440p) I don't get that same effect.
15:06 "anything filmed before Last year (2016) will never be in 4k because it wasn't filmed in 4k" Yet here I am watching terminator 2 (1991) in 4k because it was filmed in 35mm and rescanned in 4k. 35mm has a maximum detail about double that of 4k.
Actually, Lord of the Rings was filmed using 8K digital cameras, but the 8K footage was deleted when they downscaled to just 4K for archiving the raw footage. These guys would FILL hard disks per day with shooting footage. YES, 8K back in 2003, it was a thing.
And here I am watching Star Wars Episode 4 with an incredibly good 4K rescan quality. I'm amazed, and this guy on YouTube is just jealous of 4K TV owners and advocates his theme very well, but come on, get an OLED 4K TV and see if you can't see the difference, brah.
I had dual 4k 60hz monitors for my PC, and after a while, I decided that higher resolution was pretty much pointless for me. I switched to 1440p 144hz monitors, and I'm MUCH happier with it.
I have dual 4K 144 Hz monitors which is kind of best of both worlds, but those might not have existed a year ago - they're still quite pricy now. My girlfriend still prefers 1440p 240 Hz.
Am I able to identify individual pixels on a 4K image? No.* Am I able to identify a sharper picture from 1080p and 4K comparison? Certainly. (*I think that's the whole idea)
Same thing with framerate. But the argument that the people watching this probably aren't the average consumer definitely does have merit in my opinion. When you start looking at people in their forties (the average age where I live, not sure about the USA), their vision is often just not as good as someone in their twenties or even younger.
For me, it's something I don't notice nearly as much until I go back. Like, I have a 1440p 144Hz monitor for my PC, and I can *not* go back to 1080/720 30fps on my PS4. I imagine if I upgraded to a 4K monitor I would notice the difference, but not as much as I would notice going back to the old monitor.
Sarcasm I agree. Though I also agree that the jump from DVD to Blu Ray was waaay more impressive than the jump from 1080P to 4k. We are beginning to see the law of diminishing returns coming into effect with resolution.
The fact we only have 7 million cones in our eyes is irrelevant to pixel resolution or density of an image. Eyes don't see the world in a pixel-like way, and a vast amount of the processing is done by the brain, the amount of cones isn't really an issue. Eyes are not true analogs of digital cameras. The real world is made of continuous shapes and textures that are lit with infinite light rays. That's why games and movies at higher resolutions are perceptibly different, because they are rendered in pixels which the eye resolves as discontinuous and tell apart from at different resolutions. There is a point at which it becomes imperceptible but it's above 4K.
Exactly. Acting like we can't see anything above 7 million pixels because we only have 7 million cones is so fucking stupid and just shows this guy has no idea what he's talking about. He throws around the fovea and the blind spot like they have anything to do with being able to see higher definitions. Get someone to watch a 1080p video and then a 4K video on a large 4K TV; you can clearly see a difference, and if you can't, your eyes are fucked.
The dot is a bad example; you could have changed your shirt between two shots and most people wouldn't have noticed that either. You absolutely see it when you know it's there. Anyway, I did a blind test on my 55" 4K TV from 5 m away and we could recognize which was 4K and which was Full HD. We don't see in 4K, you're right, but as you said, most of the detail we see is in the center of our FOV. We don't focus on the whole screen at once; instead we pick multiple spots of interest in the image and can see insane detail in that precise area. By doing that many times a second, our brain merges it all into one single "image" and we think we see everything, when in reality we miss a big majority of the screen. The problem is that the spots are quite unpredictable from one person to another, so the movie can't just be the areas we look at; it must have every single pixel in full detail even though we randomly look at only a few percent of them at any one time. That's why I think 4K is a good evolution of Full HD. An uncompressed 4K movie would be 500 gigs, OK sure, but uncompressed 1080p is super heavy as well; that's why we compress it, to make it more accessible for our current tech. About 8K... maybe I'll change my mind when I see it, who knows, but I think at this scale we effectively can't see it, or not enough to be worth quadruple the amount of data. We should stay on 4K but focus on better color accuracy, dynamic range, or framerates for gaming. Having tried a 4K 120Hz display with Overwatch for a few minutes, I can say the difference will really be worth it when it's more accessible.
What? The way cones work is exactly like a pixel analog: they absorb light and fire signals along the optic nerve to a degree proportional to the light they are receiving, and once fired the cone has to return to its initial state and won't fire again for some time, which results in retinal persistence. If you have two intensities of light of the same wavelength hitting the same cone, it will fire with the average of the two. I don't know where you got this from, but it's literally impossible to receive more discrete units of visual information than you have sensory cells to deliver them.
@@pmangano There are similarities to how cameras and eyes receive light as you've pointed out. There are also differences (angle of incidence, colour perception and other aspects) which is why I stated that eyes are not direct analogs of digital cameras. The eyes are only half the story however, how the brain perceives and constructs images is the interesting and different part. Even though signals are sent in a discrete fashion images are constructed as continuous shapes as if lit by infinite rays. It's like a biological digital imager connected to a powerful AI. Leaving the latter out would leave you to some incorrect assumptions, which many pop-sci articles on the internet make.
120fps on a 60Hz monitor does actually have an advantage, because the image shown on the screen is more recent and thus a more accurate representation of the current situation. Linus did some videos on it.
Yeah, the information is 16 ms old rather than 33 ms old when it's displayed. There's a lot wrong with this vid. He mentions the fovea and its concentrated resolution and never deals with the fact that the fovea moves across every point of the screen, so average resolution is meaningless. Every part of the screen has to be as detailed as the fovea.
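A rough way to sanity-check that latency point (a sketch only; it counts just the render interval and ignores engine, GPU queue, and scanout delays, which is why the exact figures quoted above can differ):

    # Rough sketch: the newest completed frame available at a display refresh was
    # finished at most one render interval earlier. Ignores other pipeline delays
    # (engine, GPU queue, scanout), so treat these as lower bounds, not exact values.

    def max_frame_age_ms(render_fps: float) -> float:
        return 1000.0 / render_fps

    for fps in (30, 60, 120):
        print(f"rendering at {fps} fps -> frame data is at most "
              f"{max_frame_age_ms(fps):.1f} ms old")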
Here's the optics math to prove the *average* person can see 4K: In order to determine if a person can see 4K, we're going to use mathematics from astrophysics / photography. What we want to know is the maximum PPI (pixels per inch) that our eyes can resolve at a typical monitor / TV viewing distance. Then we'll compare that max PPI to the known PPI values of 4K monitors of various sizes. --------------------------------------------- First we have to determine the viewing angle of our eyes. The equation for this is: alpha = 2 * arctan( d / 2f ) -- where d is the size of our film (cone width), and f is the focal distance to our lens. We know these values for the eye. For the *average* human, the cone density in the fovea centralis is 147,000 cones / mm^2, or about 383 cones / mm (measured in one dimension.) We can therefore use 1 / 383 = mm / cone as our value of d. f -- the focal distance from the fovea centralis to the lens is also well known: the average human value is 17.1 mm. So the *average* human viewing angle is: alpha = 2 * arctan ( (1/383) / (2*17.1)) Plug this into a calculator (and ask for the result in arcseconds) and you get 31.49 arcseconds. So the *average* human viewing angle is 31.49 arcseconds. Anything smaller than this at the observing distance will not be resolved. --------------------------------------------- Now we'll find the average PPI that this corresponds to at the relevant viewing distance. Let's use two: 1) monitor viewing distance, 2) TV viewing distance. I just measured the distance from my head to my monitor and got ~31 inches. Call this D(mon). Let's assume a TV viewing distance of 6 feet, or ~72 inches. Call this D(tv). -------------------------------------------- Our viewing target is a single pixel. Its size will depend on the PPI of the display. Its size is relatively straightforward in terms of PPI: the size of our pixel (d) is (1 / PPI). You can show this pretty easily: PPI is measured as pixels / inch. The inverse of this is inches / pixel. So (1 / PPI) is a measurement, in inches, of the size of our pixel. In summary: d = (1 / PPI), so (1 / d) = PPI. -------------------------------------------- With this information, its very easy to find the max PPI of our vision. First, let's set up our equation and then solve for (1 / d): The equation is: theta = 2 * arctan (d / 2D) This is the same equation as before, with different variables. Theta is our viewing angle (we know this is 31.49 arcsec from our previous calculation.) D is either 31 inches or 72 inches for our purposes (although we can set it to anything), and (1 / d) is the PPI we want to solve for. Manipulating the equation, step by step (for those with no background in math): theta / 2 = arctan (d / 2D) tan(theta / 2) = d / 2D tan(theta / 2) / d = (1 / 2D) (1 / d)= 1 / ( 2D * tan ( theta / 2 ) ) ------------------------------------------- Plugging in our two values: 1) Monitor viewing distance: (1 / d) = 1 / ( 2*31 * tan(31.49 arcsec / 2)) = 211.3 PPI. 2) TV viewing distance: (1 / d) = 1 / (2*72 * tan(31.49 arcsec / 2)) = 90.97 PPI. These values represent the *maximum* PPI that the *average* human can resolve for monitors @ 31 inches and TVs @ 72 inches respectively. Values above these numbers are lost on the *average* human eye. ------------------------------------------- Only one last thing to do: compare to the PPI of 4K monitors / TVs. You can easily look this stuff up: 21.5 - 27 inches seems like a reasonable size for a monitor. 
At this size, the PPI of a 4K monitor is between 205 @ 21.5 inches, to 163.18 @ 27 inches. 50 - 70 inches seems reasonable size for a TV. At this size, the PPI of a 4K tv is between 88.12 @ 50 inches, to 63.2 @ 70 inches. ------------------------------------------- So can the average human eye see 4K? Absolutely: 163 - 205 PPI is less than the max of 211 for monitors at 31 inches, and... 63.2 - 88.12 PPI is less than the max of 90.97 for TVs at 72 inches. ------------------------------------------- This all assumes *average* vision. People with good vision have many times more cones in their fovea centralis. Science shows a large variance among individuals, and its not inconceivable for people with good vision to have up to 320,000 cones / mm^2 in their fovea centralis. To save you the time, this is equal to 21.3 arcsecond angle of view, corresponding to a max PPI of: Monitors @ 31 inches: 312 PPI TVs @ 72 inches: 134.5 PPI At 8K (!!!!) this means someone with very good vision could pick up the following monitors / TVs and get something out of them: 28.5 inch 8K monitor: 310 PPI 65 inch 8K TV: 135.5 PPI ------------------------------------------- So where does Knowing Better's analysis go completely wrong? He makes two downright terrible assumptions: 1) Every cone in the eye can only look at ONE pixel on the screen. 2) That pixel density and cone density don't matter. Also -- you know --- he just doesn't use the appropriate optics math at all. ------------------------------------------- Hopefully now: you know better.
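The arithmetic above, collected into a short Python sketch so the numbers can be re-checked; the cone density, eye focal length, and viewing distances are the commenter's stated figures, not independently measured values:

    import math

    # Reproduces the comment's arithmetic: angular resolution implied by foveal
    # cone spacing, then the maximum resolvable PPI at a given viewing distance.

    CONES_PER_MM = math.sqrt(147_000)   # ~383 cones/mm in one dimension
    EYE_FOCAL_MM = 17.1                 # quoted average lens-to-fovea distance

    def viewing_angle_rad(cone_pitch_mm: float, focal_mm: float) -> float:
        return 2 * math.atan(cone_pitch_mm / (2 * focal_mm))

    def max_resolvable_ppi(distance_in: float, angle_rad: float) -> float:
        return 1 / (2 * distance_in * math.tan(angle_rad / 2))

    angle = viewing_angle_rad(1 / CONES_PER_MM, EYE_FOCAL_MM)
    print(f"angle ~ {math.degrees(angle) * 3600:.1f} arcsec")          # ~31.5
    print(f"monitor @ 31 in: {max_resolvable_ppi(31, angle):.0f} PPI")  # ~211
    print(f"TV @ 72 in:      {max_resolvable_ppi(72, angle):.0f} PPI")  # ~91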
The one single thing you have failed to realize is that 720p, 1080p, 2160p, etc. are NOT optical resolutions like measured in telescopes, binoculars, microscopes, glasses, etc, which aid the human eye in seeing real objects. They are OBJECT resolutions which actually limit what the eye can see. The world is made of atoms, not pixels. As you get closer to a tv you can see the pixels better and better, meaning the overall picture gains less detail. In real life, as you get closer to a tree for example, you see MORE detail. The serrations on the leaves, the texture of the bark, are all more clear until you are so close your eye can’t focus on it anymore. This is why more pixels on a screen will always result in a better and more realistic looking picture. There are almost infinitely more atoms in a space even compared to a 4K tv.
I think the overarching point is simple. At some point you're going to have more detail than your eye(s) can see no matter how close you stick your face to the screen.
"8K on the other hand? It will never happen" 3 years later, nvidia's already pushing 8K gaming with 3090 lol. guess it's all about anti-aliasing from now on. people love extreme things, when it doesn't make any sense no more, and companies know that and use that perfectly. I'm saying this but the idea of 4K and 8K excites me as well. what can I say?
@@recoveryemail1046 They put 4K TVs on sale all the time now to entice folks into switching. I bought my 4K Samsung back in 2018 for only 420 bucks. At the time it was the highest-rated 4K TV in its class for image quality (49"). Smart TV too, with tons of features built in; it even upscales images to 4K (albeit upscaling DVD-quality stuff doesn't look good). Blu-rays and 1080p content upscale beautifully on this TV.
4K is the way now and there is a visible quality difference. I shoot in 4K, 4:2:2, 10-bit, 400Mbps, All-Intra on my GH5s, and used to shoot 1080p 100Mbps 8-bit on my GH2 camera. The difference is vast, especially in colour depth. HD is still very good, but I shoot in 4K and 4K is really picking up. 8K is here also, but here I must say I don't see the difference on a normal-size TV. I would consider 8K for video cropping when editing projects.
There is absolutely a huge difference between 1080p and 2160p. It does depend on the size of the screen; the larger the screen, the more you notice the difference. I would agree, though, that at some point it will stop making sense to increase pixel density. That's why manufacturers are going for color depth instead.
Johannes Snutt I remember reading a tech article stating they can make 8K and 16K now, but it's pointless due to cost, and there's nothing to watch on them anyway.
I think you have to factor in DV/HDR, how close you are sitting to your set, and whether or not the source is true 4K to make a claim of "huge difference". Not to mention the quality of the set itself.
The term "4k" is ridiculous, it makes people think it's 4x the scanlines of 1080p when in reality they started counting *columns* instead of *rows.* But that said there is an obvious, noticable increase in quality when watching 2160p over 1080. This guy just doesn't care about quality. It's no surprise his video is poorly lit and looks bad.
I have a 75" 4k tv and at the distance of 10 or so feet I sit at I can still see a difference but a bit closer up and it's freaking night and day. It also must also depend on ones own eyes.
I'm the same way as Hells Cat. Family just got a 70" Samsung 4K/HDR. For the size room we have and that size of TV, it makes 1080p look like shit! However, the bigger difference-maker on that TV for what I use it for, which is my Xbox One X, is HDR. All that said, I would still rather keep my Xbox One X hooked up to my 22" 1080p monitor and pray that the game devs let me use that extra processing power to run a game at 1080p 60fps. I can also see a big difference in games running on my Xbox One X compared to my One S, both connected to my 1080p monitor, due to the way games are made with dynamic resolutions and are now all downscaled from 4K instead of upscaled from 900p.
Depends on the content and how fast it is moving. There are times when a 4K image will be perceived as a tad sharper and crisper even though you can't distinguish the pixels.
With regard to frame rates: You mentioned it yourself: a person's visual perception increases in performance when they are more keyed up. An example of such a demographic are competitive twitch gamers, who are not only keyed up, but self-trained to notice details as a reflexive action to improve their competitiveness. You are correct about the motion blur, but remember that human vision is different from screens, in that it is _continuous_. There are no frame rates at all, because human vision does not chop up its input signal into individual frames, so it has the potential to pick up on any moment of time, regardless of its brevity, so long as the person is either 1: lucky, or 2: watching for it. So, while high framerates may be way over what is needed for passive entertainment (things you watch), active entertainment (things you interact with) would still benefit from those increases. As for that one pixel, it's easy to miss. However, if you were to make a 1-pixel line that extended over a quarter of the screen, it would likely be noticed, and some might just wipe at that spot, thinking that a hair was stuck to it. Individual pixels may not be noticeable, but remember that the smaller the pixels, the sharper the edges, and edge detection is something the human eye is very good at, a point you made yourself in the self-driving cars episode, so sharper edges would be noticeable. And the fovea is always the center of vision in a sense that can be moved; even if there's a small portion of your visual area that has high resolution, the fact that the eyeball is both redundant (there are two of them) and mobile (it can twist about) with a mental frame buffer means that the size of the brain's "virtual screen" is much wider. Of course, this is pointless where passive entertainment is concerned, but active entertainment would still see benefits in higher resolution and framerates, particularly in the previously-mentioned keyed-up twitch gamer demographic.
@@antanaskiselis7919 I have to agree. @Lampros Liontos well put! He has some merit in the video, in areas, but it's a broad subject with a number of applications. What makes me sad is the number of people who just say "nuh-uh" and personally attack people for their statements, considering the majority probably have no knowledge of the subject and are great examples of Dunning-Kruger. We need to take a note from Lampros and educate and discuss rather than assault. It solves nothing and betters no one.
Another point: being an artist. 4K is amazing for artists, as we can better produce larger-scale and more detailed works. And higher definitions allow for larger monitors (frankly, buying a 4K TV as a computer monitor has no real downsides and is more functional), since they allow for more references on screen, more tools, and other useful things an artist would otherwise either have to sacrifice image space for or constantly have to dig out any time they needed them.
Also speaking as a person who did some pro gaming as a teenager (Try not to judge me too much) We all turn off motion blur because of the chance it could obscure an important detail. In games like Rainbow Six, seeing the muzzle of a gun around a corner from 15 meters away can be the difference between winning and losing that round. If you have any blurring effects such as depth of field or motion blur, it will decrease the odds you notice.
Phenian Oliver Smart kid, agree with you about the motion blur that obscures details and can affect gameplay because it makes it harder to spot something noticeable. Great comments, and don't worry, only morons stuck in the 'olden days' judge gamers, even though it's more widely accepted now by adults.
I don't know, man. As soon as I upgraded to 4K I noticed an immediate difference. So noticeable that I sold 3 of my other 1080p displays and got another 4K monitor.
king Dione Why would you get a 4K monitor? Don't you know that operating systems' and video games' retinas can't see in a resolution that high? And if you play the game in a high resolution, the game itself would get dizzy!
We don’t see in 4K… that is apt; but that’s still not enough for the intended effect. We’re not trying to match what our eyes can see, we are attempting to overmatch what our eyes can differentiate so it gives the impression that we’re looking at real life when we’re looking at the screen.
@Morris Stokes Sanitizers and wipes usually have alcohol or a similar chemical that kills things chemically, whether it's a virus or bacteria, the way bleach would, so they also work for covid. Different from stuff like antibiotics that target things only bacterial cells have.
The laptop I have is 768p. I use a 1080p monitor for my business PC and notice a difference. And for my gaming PC, I have a 4K monitor. I also notice a difference.
Oh come on. Did you even watch the video? Listen to what was said? "The average consumer will not be able to see a difference on a TV at a 10-foot viewing distance." Of course you can see a difference on your monitor from nose-length distance...
As a professional videographer, this video gets a lot wrong: - Motion blur on cameras isn't caused by the frame rate, it's caused by the shutter speed. The general rule is to have your shutter double the frame rate. You can achieve this same "no motion blur" effect by shooting at 1/250 shutter on your camera. - There's plenty of 4K in 2018; it's on YouTube and Netflix. I know Netflix Original shows are required to be shot in 4K now, even though most people will be watching them in HD. Optical media is dying; it's now physical media vs. streaming. Hell, most computers don't even come with disc drives anymore. - 927GB for a 90-minute 4K movie is out of whack. You're correct if we shoot in an uncompressed RAW format, but 99.99% of people don't do that because of the huge space requirements. We have this thing called video compression that squashes video down to a more tolerable file size. My Sony a6500 4K camera can shoot 1 hour at the 60 Mbps setting on a 32GB memory card. Plus, newer codecs like H.265 are gonna squash down the file size even more and still provide clear quality. ✌️
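The file-size point is just bitrate times duration; here is the arithmetic as a quick sketch, using the 60 Mbps figure from the comment above plus two assumed, typical bitrates for comparison (not measurements of any particular title):

    # Bitrate-to-storage arithmetic (decimal GB). The 60 Mbps value is from the
    # comment above; the other rows are assumed, typical delivery bitrates.

    def size_gb(bitrate_mbps: float, minutes: float) -> float:
        return bitrate_mbps * 1e6 * minutes * 60 / 8 / 1e9

    print(size_gb(60, 60))    # ~27 GB: one hour of 4K at 60 Mbps fits on a 32 GB card
    print(size_gb(100, 90))   # ~67.5 GB: 90-minute film at a UHD Blu-ray-like bitrate
    print(size_gb(15, 90))    # ~10 GB: 90 minutes at a typical 4K streaming bitrate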
Good points for video production. But not so useful for the average person watching TV at home from 10 feet away. And H.265 does require quite a bit of processing power that the average person does not likely have on tap. Moreover, streaming 4k requires a wide pipeline over the old cable modem to boot, not the 1 to 3 MB/s that most people can afford.
Optical media isn't dead, it has WAY better A/V quality than streaming. Lossless 7.1 audio, HDR, and high bitrates. Hell, a UHD Blu-Ray can be 100GB! Can you stream that? No. Internet speeds and data caps are why Blu-Ray will live on.
Kyle, for the record, my very affordable (about £1 per day) connection has a solid 200 Megabits/second speed, and it's not the fastest around here. (Others have 'up to' 400 Mb/s.) I have no data cap whatsoever, apart from a Fair Use policy which will temporarily halve download speeds of the top 5% of users, during peak periods, after an hour. That 200 Mb/s means that I can download 25 Megabytes per second, or 1 GB every 40 seconds from a good source. (A Debian Linux DVD takes about 3 minutes to download from the Netherlands.) So 100GB could be downloaded in 67 minutes or so; less than the time it takes to view most movies. And this is England. Some countries have higher affordable internet connection speeds. Surely the U/S has better, and lower-cost, internet connections than the majority of countries?
I don't think your examples are a good illustration of your main points here. 1 - Being able to distinguish an image flashed for 1 frame does not correlate with being able to perceive smoother frame rate. I'd be more convinced if there were a study that showed that people were not able to distinguish between movement on a 60hz display vs movement on a 120hz (or 240hz) display. I might not see an image that appeared for 1/144th of a second, but anybody with functioning eyes will be able to tell 144hz from 60hz. 2 - Similar to the above, being unable to detect a tiny dot on a piece of paper doesn't correlate to not perceiving the increased fidelity of 4k. The real question is can people tell the difference between 1080p and 4k, and the answer is a fairly resounding yes, though perhaps not to the degree of 60fps vs 144fps.
Not 100% sure if it applies, but according to the Nyquist-Shannon sampling theorem, for an optimal experience the frame rate should be 2x what you can perceive.
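For reference, the criterion being invoked, with an illustrative plug-in (the 60 Hz value is only an assumed example of a viewer's highest perceivable temporal frequency, not a measured one):

\[ f_{\mathrm{display}} \;\ge\; 2\, f_{\mathrm{perceived,\,max}} \qquad \text{e.g. } f_{\mathrm{perceived,\,max}} \approx 60\ \mathrm{Hz} \;\Rightarrow\; f_{\mathrm{display}} \ge 120\ \mathrm{Hz} \]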
Considering framerate: LinusTechTips made a collab with Nvidia about different refresh rates (they still hadn't uploaded it at the time of writing), so I guess it might interest you.
And while I think your points are generally sound, your logic with photoreceptor cells is flawed. While, yes, we have far fewer cone cells than the number of pixels on a 4K screen, it is precisely because they are condensed that higher resolutions matter. We don't see the entire screen at once, but anywhere the vision focuses may well be seeing up to the number of pixels in that area, depending on screen size and viewing distance. Sure, we may not be able to discern single pixels, but we also can't discern single receptors; that doesn't mean they aren't carrying useful information.
This is exactly why pixels per inch / viewing distance matters much, much more than just "higher numbers = better viewing experience." Glad you took the time to write out that reply, man! :D
I think most retinal cells are ~"multiplexed"~ by a factor of about 2-to-1, cell-to-neuron, as far as connectivity. Our expectations also directly impact scene perception. We're colorblind in the periphery, but track movement better there. There are also cranial nerves that "intercept" the optic nerve and activate neck muscles based on info from the periphery, which is also directly connected to the brain area that controls micro-saccades (eye twitch rhythm). Bottom line is you can't stop at the retina when discussing perception. Vision ≠ perception.
Without even realizing it, you just proved his entire point. "....we may not be able to discern single pixels," because the eye cannot cognitively perceive the difference between a line that's .01134 and one that's only .02269 from eight feet away on a 50 inch screen, but "that doesn't mean they aren't carrying useful information." Yes, you're RIGHT; they ARE carrying useful information. The kind of useful information that's employed for mental reprogramming and conditioning the viewers.
As with most engineering types, KB is making the mistake of looking at raw numbers and ignoring dynamic effects. The flaw in the original CD standard was that the human ear, most of the time, can only hear 20kHz max... so 40kHz can capture all you can hear unless your range is higher than most. The problem is, while we can only directly hear 20kHz, we can hear harmonics and transitions at much higher rates, which led true audiophiles to sense that CDs sounded "muddy"... because they were muted in those high-kHz harmonics and transitions they were fully (or at least unconsciously) cognizant of. Hence the MP4 standard, which allows variable frequency rates as high as 320kHz or higher (320 is the highest I'm aware of seeing spec'd for anything; no idea if you would ever need anything that high, or even higher). But most MP4s are at 120kHz and higher, 3x the CD spec. He's analyzing vision from that same purely mechanistic view, failing to grasp that our brains interact with information in a wide array of manners which extend into spaces he argues (incorrectly) that we are incapable of recognizing. Note, I'm not saying he's entirely wrong on all counts -- just that he's not making a valid analysis, because he ignores those dynamic effects and interactions with the brain, which can interpolate stuff into areas we are unconscious of. A cognitive biologist would probably need to analyze his observations for validity, likely even doing some experiments where none have been done before.
@@nickbrutanna9973 Except that in the only actual tests done on the subject, like with actual double blind testing, audiophiles haven't been able to tell the difference. Audiophiles are just as likely to be swayed by their bias as regular people. The "harmonics and transients" stuff is 100 percent outside of the human hearing range. You actually need to listen to the engineering types, because they're also scientifically literate. This particular video does mix some things incorrectly, but not as much as what you have done.
I know other people have said this video is crap (it is) but I'd like to clearly and scientifically debunk it. You claim that the eye "sees" in 45-72fps, and therefore that you wouldn't see anything if it flashed for 1/72 of a second (~14ms). In fact, this rate refers to the temporal resolution of the eye: if a light flickered on for 1us (0.001ms) you would see it, but if it flickered twice 10us apart you would only see it as one flash - events that happen within one 'frame' are averaged together. That said, this 'framerate' is (at rest) not 45-72fps, it's 80-100, and thanks to aliasing effects analogous to what we see with mismatched resolutions it is possible to detect artifacts at 'framerates' of up to 1000Hz You claim that, because 4K UHD is 8.3MP and the eye "sees" in 2MP (very sketchy way of getting there might I add) the resolution is pointless - but the visual information from the fovea (your centre of focus) has much higher 'pixel density' than elsewhere so if you look at any one part of the display that resolution becomes useful. The average eye has an angular resolution of around 1 arcminute (1/60 of a degree) - that is, it can differentiate two objects 1 arcminute apart. This means you can differentiate two pixels more than 0.73mm apart 8ft away - which at 1080p means displays larger than 60''. This doesn't factor in the significant proportion of people with better than 20/20 vision, who would need better than 1080p to have a good experience even at 55in. (8K on TVs is pretty useless, as even most USAF pilots would need a >60'' display to see pixels at 4K - although I could see it coming to cinemas in the next few years.) Even with 20/20, you would want a higher resolution than this minimum to ensure that pixels are not just close enough that you can't see them, but also close enough that they smoothly blend like a real-life, analog scene would, avoiding aliasing. There are a bunch of other things which don't detract from the main point of the video but are still worth mentioning: - VR, AR and 3D are _still_ the next big thing, there's no point to the "Pepperidge Farm Remembers" - people who brag about 150fps are people with good enough hardware to achieve 150fps - meaning they usually have money to spend on a higher refresh rate monitor. Not to mention that having higher framerates on lower refresh rate monitors still serves to reduce input lag. - the 'Soap Opera Effect' is actually when TV hardware generates interpolated frames: 24fps content can be shown on a 120Hz screen either by repeating each frame 5 times, or by 'interpolating': trying to work out what should happen in between. Since we can tell the difference between 24fps and 120fps content, and we associate the 'feeling' of 24fps with cinematic content, this interpolation makes it feel more like TV, and less like cinematic content. It doesn't cause motion sickness (unless the interpolation algorithm is poor) it just makes the content feel less professional. It can usually be disabled so that frames are held for as long as necessary, so there is no downside to having a higher maximum refresh rate. - there's nothing wrong with the naming conventions. The 'nice pattern' of 480, 720, 1080 continues to 2160, just as the pattern of SD (Standard Definition), HD (High Def.n), FHD (Full HD) continues to UHD (Ultra HD). 
2K and 4K were defined in the original DCI spec of 2005 as resolutions of 2048×1080 (yes, 1080p ≈ 2K) and 4096×2160 respectively, and since the new UHD resolution of 3840×2160 is close to this it's often referred to as 4K UHD. People (including you, I might add) misuse the term 4K but it doesn't make a huge difference or mean the conventions themselves are at fault. - "anything from before last year will never be in true 4k because it wasn't filmed in 4k" Sony were selling a 4K cinema projector back in 2004, so plenty of digital media since then is in 4K - not to mention the vast quantity of analog film reels which can be telecined to 4K or even 8K - "The current generation of Blu-Ray disc ... holds 125GB of data" there aren't any that hold 125GB. There are 25, 50, 100, 128, 200, 300GB versions, so perhaps you meant 128GB. Except... there's a 300GB variant? In fact 4K Blu-Rays were released long before this video, using 50, 66 and 100GB capacities, with films like Spider-Man 2, Ender's Game, The LEGO Movie, Kingsman all released with the format. - "a 90 minute 4K video is 477GB" depending on the encoding. Uncompressed, it could be 8TB. It could, in theory, be 1MB. It could, more usefully, be 30GB. As with all videos, you get better quality with higher bitrate. That could be a reduction in artifacts such as color banding, or it could be an increase in resolution as here. - you suggest that streaming in 4K would result in buffering. As above, the bitrate is completely arbitrary, so you could stream average-quality 2160p video with lower bandwidth than high-quality 1080p. It's a resolution/framerate/bandwidth tradeoff, so it's just about how much you pay for internet. - "and there are virtually no games in 4k right now" the vast majority of games have resolutions that scale with your monitor. The 980, released over a year before this video, can deliver solid framerates (on titles available at the time) at 4K, so the processing power (while large) is not impossible to achieve. - "Steam players are the PC gaming elite". Where in hell did you get that? Steam is the world's largest game platform. Steam players are pretty representative of *all* PC gamers. - "in 2016 only 25% of new TV sales were 4K"... representing a *huge* year-on-year increase which continues to this day. - "it's going to take another 5-10 years for 4K to catch on to the mainstream" I would already consider it mainstream, with most new TVs being 4K and most digital content providers providing 4K content. - "8k will never happen" It will appear in cinemas and on large high-end TVs - great poll there. sample size of 15. super duper statistically significant.
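A small sketch of the 1-arcminute arithmetic used in the comment above: the smallest pixel pitch resolvable at 8 ft, and the 16:9 1080p diagonal at which the pixel pitch reaches that limit. The results land at roughly 0.7 mm and roughly 60 inches, matching the figures quoted within rounding; square pixels and 16:9 geometry are assumed.

    import math

    # 1-arcminute acuity arithmetic: smallest resolvable pixel pitch at a viewing
    # distance, and the 16:9 1080p diagonal where the pixel pitch hits that limit.

    ARCMIN_RAD = math.radians(1 / 60)

    def min_resolvable_pitch_mm(distance_mm: float) -> float:
        return distance_mm * math.tan(ARCMIN_RAD)

    def diagonal_inches_for_pitch(pitch_mm: float, rows: int = 1080) -> float:
        height_mm = pitch_mm * rows
        diagonal_mm = height_mm * math.hypot(16, 9) / 9   # 16:9 geometry
        return diagonal_mm / 25.4

    pitch = min_resolvable_pitch_mm(8 * 304.8)   # 8 feet in mm
    print(f"pitch ~ {pitch:.2f} mm")                                      # ~0.71 mm
    print(f"1080p diagonal ~ {diagonal_inches_for_pitch(pitch):.0f} in")  # ~62 in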
Huy Đức Lê Uh, at closer range that means you're eliminating the spatial area of the pixels and screen and making them "larger and the screen smaller"... (aka shrinking your screen and magnifying the pixels to the focal point of your eyes). You STILL won't tell a difference. It's a dumb topic and it's all based on $$$$. When you see an atom with your bare eye, please let me know lol... Take a math class beyond high school before calling someone arrogant. Jeez.
You aren’t considering the fact that the extra pixels give an increase of perceived contrast. This actually makes things look more realistic even if you don’t actually see all the pixels. It’s actually noticeable for a lot of people.
This. 4K gives a level of realism to textures and edges which otherwise has to be simulated by techniques like antialiasing, which give inferior results and often produce visible artifacts.
Roy Riley So true. And because you can move your head in much smaller increments, you will land between 4K pixels. So higher res will always improve your experience.
I guess I must be a fighter pilot high on every narcotic in existence, because I can see the difference between 1080p and 4K, and the difference between 60 and 120 or 144 frames, and it's quite clearly obvious.
Slam......Do you always respond to other people that way?? Getting off the main subject here, but maybe you are looking at yourself in the mirror as you say all those childish things!
It's true. I run SVP whenever I watch video on my PC to increase the frame rate. If I don't have more than 60 fps, and I prefer 120, I don't want to watch. I probably won't buy anything less than 1080p either. I'm not a fighter pilot.
No. We don't. It has its merits in giving off an optical illusion that makes us FEEL like we're REALLY seeing an increase in quality, but our eyes and brain have their limits. It's called perceptive fantasia. I've even refuted this nonsense with a link to a peer-reviewed study that explains how such things work between our eyes and a screen when it comes to resolution and FPS/Hz. I think anyone with any sense can concur that the ONLY thing that is going to TRULY give you the most BADASS of all BADASS quality... is actually paying the price for the part upgrades inside of your RIG, not the monitor. The monitor helps, but it's really your computer/console superiority that is doing all of the real work. If I asked you to sacrifice either your 4K screen for a 1080, or your 1080 Ti or even 2080 Ti for a 720, which one are you ultimately going to favor when it comes to performance and quality?... Like, I would like to think that anyone with any SENSE would say fuck 4K, lmao!... You can actually play HIGH-ass quality games WITHOUT 4K. People are giving most of the credit to the wrong shit, when they should really be giving the credit to the people who are responsible for making their ever-evolving parts. Even the TYPE of screen is more important than the size or pixel count of the screen. Like, having a cheap, low-quality screen type with 4K wouldn't really do anyone much justice, would it?... I've got to say, for a generation that likes tech so much, they sure as fuck don't argue over the right points... This 4K altercation got REALLY old a LONG-ass time ago. These people are driving me INSANE with this vacuous absurdity.
@@dentongaemz8359 Well, there is good reason not to play FPSs in 4K. It's hard to hit 60fps in 4K unless, like me, you actually have a 2080 Ti and a 4K monitor; a 1080 Ti gets like 30-40fps. 4K is great for story games with great visuals, not so great for competitive gaming, where you could instead play at 1440p and 144fps.
We don't, but we think we do. So it doesn't even matter, because our brain says it is better. And we are our brains. So in the end, we basically see 4K even if we don't really see it.
"Anyway, the cones in your eye are ONLY red, green, OR blue." Humans have three different cones that respond to short, medium, and long wavelengths of light. This produces color roughly corresponding with blue, green, and red respectively. Humans are trichromats, there are other living creatures that have more (or less!) responsive cones, these are called tetrachromats or pentachromats. "...there are only 2 millions 'pixels' in your eye." Humans can see far more color than that, about 5 times as much actually. The RGB color value for traditional displays is limited, it's called the color gamut. Most displays shoot for ~90-100% sRGB which does not cover the whole array your eyes can see. Further, RGB is not quite accurate to how we see, HSL is much closer. That's Hue, Saturation, and Lightness. Essentially it's color, intensity, and brightness (where rods actually matter). 4K TVs are slowly pushing the envelope here by increasing the bit depth of displays and achieving a broader color gamut. Now...your whole argument about 4K being essentially worthless because it's too many pixels is simply bunk. Pixel density is based on screen size and pixel quantity and importance varies depending on the viewing distance and visual acumen of the person in front of it. It's hard to quantify an ideal density due to the difference in viewing distance, but there is a reason why an iPad with a retina display looks so sharp compared to your average 1080p TV at equal parts distance for their relative display sizes.
Yes, a pixel is the individual R/G/B elements of the pixel arrays. Knowing Better said: "Now get this: there are 8.3 million pixels, and there are only 6 million cones in your eye, but that isn't the whole story because each pixel is 3-pixels in one: red, green, and blue. ... Anyway, the cones in your eye are only red, green, or blue and while the ratio isn't exactly a third if we match them up to make a pixel, that means there are only 2 million 'pixels' in your eye. Before you say anything, rods don't matter if cones are the pixel in your eye then rods are like the back-light." This implies to me that that either: 1) We are only capable of seeing 2 million different colors, or 2) We are limited to only seeing up to 6 million pixels It's not very clear what the point he was trying to make was as both of those points are flawed.
@@walterg74 Walt If your perspective is that people must be correct based on subs and views, you most likely won't live into old age when you trust that life hack video that tells you to plastic wrap your food before you cook it... Do yourself a favor and think for yourself. This guy might have a credible channel but this video is absolute bull.
The editing on this was great. That being said, the video didn’t age that well tbh. I think this guy underestimated the progress of technology and how fast it really moves.
You can definitely see a better blend of colors due to more pixels. Also look at a face on a 1080p television and look at that same face on a "4K" television with 4K content. You can see much more detail, filling in the face, on a 4K television (again with 4K content). The difference is blatantly clear.
Yes I do. Now stop making false claims. Human eye focus resolution is multiple times higher than standard 4k, and that's completely ignoring your peripheral vision.
Also of note: the so-called "perfect vision" is not perfect. It's just a bare-minimum standard considered acceptable before correction is needed. The majority of people in the world see significantly better than that standard, especially young people.
Yes, in fact investigations and research are revealing that the pixel resolution of the eye may be as high as 10000 x 10000, with frame rates approaching 1000/sec!!! I mean, just do side-by-sides of 1080p and 2160p 4K UHD... it's mind-blowing... which blows this guy's pseudoscience out of the water.
Randomly found this trash video [haven't watched it yet] and happened to see a lot of butthurt comments and jokes... well, I'm here to say one of my eyes used to have inhuman vision, but only when I closed the other. When I did, I could see the eye chart [you know, the one with the big R etc.]... the doctor said I had 20/22 vision
Games don't "come in 4k". Also, Blu-ray discs don't contain uncompressed 4K video like you seem to suggest. 4K Blu-ray discs have been around for a while. It's like this video was recorded in 2012 and you forgot to publish it.
Battlefield 1 Destiny 2 Divinity: Original Sin 2 Doom FIFA 18 Fortnite Forza Motorsport 7 Gears of War 4 Grand Theft Auto V H1Z1 Halo Wars 2 Hellblade: Senua’s Sacrifice Law Breakers Middle-earth: Shadow of War Nex Machina No Man's Sky Overwatch PlayerUnknown’s Battlegrounds Prey Pro Evolution Soccer 2018 Project CARS 2 Resident Evil 7: Biohazard Sniper Elite 4 Tekken 7 The Witcher 3: Wild Hunt Wolfenstein II: The New Colossus apparently all of these games do
@@AndrewBouchierUK Well, no video game really "comes" in any resolution. One that is made in a smart way will scale to arbitrarily high resolutions if you have a powerful enough computer for it. The games you listed simply market the fact that they can (like pretty much every game made in the last 10 years) be rendered in 4k, should the user wish to do so.
The difference between a 4K Blu-ray and the 1080p Blu-ray that also comes in the box, on a 65 inch screen, is 100% noticeable. Watching a film that is actually shot and mastered in 4K is miles better.
Connor Ross Right, and it's also important to mention that 4k Blu-rays are bit-starved (they usually use twice the bitrate for a resolution with 4 times as many pixels, while HEVC is just 20-50% more efficient than H264, not 100% like they promote it), so there's still room for improvement without even raising the resolution.
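(A rough sanity check of the bit-starved point above, as a sketch; the 30/60 Mbps figures are illustrative assumptions in line with the "about twice the bitrate" claim in the comment, not measured disc bitrates.)

```python
def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    """Average coded bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

bd_1080 = bits_per_pixel(30, 1920, 1080)   # 1080p Blu-ray, H.264 (assumed ~30 Mbps)
uhd_4k  = bits_per_pixel(60, 3840, 2160)   # UHD Blu-ray, HEVC (assumed ~2x the bitrate)

print(f"1080p Blu-ray:  {bd_1080:.2f} bits/pixel")
print(f"4K UHD Blu-ray: {uhd_4k:.2f} bits/pixel")
# Even crediting HEVC with a generous 50% efficiency gain over H.264,
# the UHD disc's effective per-pixel budget still comes out lower.
print(f"4K UHD with 1.5x HEVC credit: {uhd_4k * 1.5:.2f} bits/pixel")
```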
+1 I notice it immediately too. The thing is, the center of the fovea is very photoreceptor-dense, and many people can tell the difference between a high quality 4k moving image and that image downsampled to 1080p. It's just that it's the individual spot you're focused on within a moment that you'll pick up, not the entire screen at once, which is one of the major flaws (out of many) of this YouTube video. Most 4k Blu-rays also give us an increased color gamut, and that's also immediately noticeable to the majority.
Higher framerates are better not because you can spot every frame but because it just feels smoother and input lag is lower. Having said that, I still stick with 60fps.
It feels smoother because the interval tightens, and resyncing with the screen is faster because the interval is smaller. However, you could put effort into keeping a steady frame interval instead of pumping effort into higher frame rates. A pro gamer could perform better on 45fps, if it ran with solid frame intervals, than on today's 120fps, because he would be able to trust his timing and perception even more, and thus have more success intercepting movement for shooting moving objects. That is also why a movie played on a solid medium does not feel stuttery, but a game on PC running at 24fps feels really bad. The brain will perceive the motion as way smoother if the playback is solid.
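(A toy sketch to put numbers on the frame-pacing idea in the comment above; the two frame-time traces are made up for illustration, not measurements, and this isn't a claim that a steady 45 fps beats a steady 120 fps.)

```python
import statistics

steady_45fps = [22.2] * 8                    # ~45 fps, perfectly paced
spiky_90fps  = [7, 7, 7, 7, 7, 7, 7, 40]     # averages ~90 fps, but with a hitch

def report(name, frame_times_ms):
    avg = statistics.mean(frame_times_ms)
    print(f"{name}: ~{1000 / avg:.0f} fps average, "
          f"frame time {avg:.1f} ms +/- {statistics.pstdev(frame_times_ms):.1f} ms, "
          f"worst {max(frame_times_ms):.1f} ms")

report("steady 45 fps", steady_45fps)
report("spiky ~90 fps", spiky_90fps)
# The spiky trace wins on average frame rate, but its worst frame (40 ms) is far
# longer than anything in the steady 22.2 ms cadence, and that spread is what
# reads as stutter.
```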
@Martin Panda: "A pro gamer could perform better on 45fps, if it ran with solid frame intervals, than on today's 120fps, because he would be able to trust his timing and perception even more, and thus have more success intercepting movement for shooting moving objects." That's actually demonstrably false. There are plenty of games where you could pick a resolution 'low' enough for the game to run smoothly at 120fps basically all the time. Not only that, but in online gaming a lot also comes down to having a videogame programmed well, with proper network code and fast enough servers. Fact is, smooth gameplay on a rock solid 45fps versus a rock solid 120fps will *always* result in smoother play at 120fps. There's no debate there. If you knew pro gamers, you'd also know they prefer 120fps over anything lower. Last but not least, by your 'logic', a rock solid 24fps would perform just as well as a rock solid 45fps, which is simply not true.
You think it feels smoother. As long as your video card can render 60fps, and the rest of the computer can feed it 60fps, that's (mostly) what you see. If you watch a DVD on your computer, is it blocky and laggy? That is a good way to demonstrate what your screen can give you.
We get upset when we're gaming and in huge battles we see the blockiness. That isn't because the screen can't do it. It's something about the game. Any computer-based fps counter isn't necessarily showing you what has gone through the video card to your screen; it's what should have been rendered. So sure, a complete system optimized for 120fps will look "better" than 60fps, only because it's satisfying all the demand for 60fps, from the game through to the light emitted by your display. When 120fps degrades to 90fps, and all the way down to 60fps, it's still satisfying your 60fps need. When it degrades more, you'll see it. A good system and video card satisfy that, without you bankrupting yourself trying to reach the 120fps mark.
We have a lot of technology now that depends on your visual acuity only being 60Hz. Sometimes it's only 50Hz. Like, some people can see lights flicker and get headaches when the local power grid is only 50Hz. Not everyone, only a few. You still don't notice things like multiplexed segmented displays flickering at you at 60Hz, but it (sometimes) shows up on video recordings. A lot of affordable 120Hz screens are really 60Hz panels with a 120Hz pulsing backlight. It lets them advertise "120hz" without admitting they're only 60Hz. Most people won't ever figure it out, unless someone shows them the detailed spec sheet.
A smoother experience is also dependent on your video processing and communication between hardware. And yes, a higher framerate or Hz does mean smoother. That is why 4k monitors struggle to reach a good response time and high refresh rate.
Exactly. You want a screen that is beyond the performance of the human eye. That way, what you see is indistinguishable from what you would see naturally. You won't detect any display artifacts from having visual acuity better than the screen you are looking at. There is more to what you see than simple pixel counts on a screen.
You forgot to mention one of the real reasons to get 4K. People oftentimes upgrade the size of their TVs. Once the TV gets large enough, the pixels become 1) too large or 2) too spaced apart. This is obviously affected by the viewing distance to the TV, and like you said, the average consumer is probably not buying a 70+ inch TV, but it's still an important fact. This effect is also noticeable with new phones, depending on the display resolution. When upgrading to a new generation phone and looking at the difference in the display, it can become obvious just how much you actually can see the pixels on lower-DPI displays.
Darn it, I went too big, can still see them on my 48". ;) Obviously this is more in games though, due to the huge contrast of the HUD; movies are harder to tell because the colour transitions are softer.
I think one point is missing, which is pixel density. Higher resolutions look more crisp the larger the display. 8K might not make sense on a 60 inch TV, but might be more effective on a 6 foot wide home projector. Right now that's not for the average consumer, but I can see projectors becoming a popular format during the growth of 4K.
Yeah, sadly this video just scratched the surface. The 4k discussion is also about compression, screen size, pixel density, viewing distance, processing power etc. Things that can fill a whole video series
The density/pixels per inch/ppi was what I was thinking right away, especially smartphones. I remember laughing when I saw a 5" smartphone with 4k resolution. The ppi in that is just ridiculous, and seems like overkill, even when considering the closer viewing distance. But to be honest, I'm not completely sure. After I was taught in school "Humans can't see more than 24 fps", and "Humans can't hear the difference between mp3 and FLAC", I'm just super sceptical about all of these "Humans can't..." claims.
Yes... but no. Home projectors are not a thing because you need to be so far away from a screen that big to see it in its entirety. Which means the pixel density is not remotely that important, because you're sitting so far away. Which is the whole point of a theatre. Most projectors in cinemas are 2K...
@@GabrielAlves-lp1qr actually there are more theoretical colors than colors that you can see/find in nature. Color space has nothing to do with resolution. What are you talking about lol
Nonsense, I've made jumps from 30 to 60 and eventually to 144hz on monitors, and while I have certainly noticed the upgrade, EVERY SINGLE PERSON THAT STEPS FOOT IN MY ROOM IS BLOWN AWAY AT HOW SMOOTH AND CRISP THE MOVING IMAGES ARE. Enough of your pseudo-science; of course nobody's eye sees in frames and pixels, but that doesn't mean we should just settle for a 1080p 60hz screen. Cut the tech-shaming
The real MacGyver He's right though. It is a ridiculously huge difference. The problem with the argument against it is that the studies don't actually test the perceived differences in fluidity and smoothness; instead they test individual frame recognition, where they flash an image on a single frame and ask you to identify it. This isn't an accurate test, as you don't see frame by frame, you see the sum total of the frames and then perceive motion out of it; the more frames, the smoother and more realistic it appears. The actual upper limit to frame rate is over 1000fps, but 150-200fps seems to be a nice sweet spot.
When I was playing on my 144Hz monitor for the first time, I was laughing hysterically because I was so amazed by it :D When a friend of mine also upgraded, he said he couldn't see the difference in games, but in those online 30fps vs 60fps vs 144fps slideshows he could. The only person I know who "can't see" 144Hz. Weird
It's kinda fun coming back to this video years later. Also a lot of what makes your vision good is your brain. Most of what happens in your eye doesn't matter. It's just about getting enough information for your brain to upscale it properly lol. Though if your eyes get worse your brain eventually can't compensate. So there's that.
It's a marketing gimmick, so no, it truly will never happen, just like 4k. Even if it did, we as human beings wouldn't be able to detect it, again just like 4k.
@@Openreality You really believe this moron in the video? There are so many dislikes for a reason; you don't need to believe everything you see on the internet, you stupid moron. If you see a 1080p and a 4k display side by side you will be so surprised, and if you put a 4k and 8k screen side by side you will still be able to see the difference.
@@Openreality Of course it will; we'll need it for VR. They are already making VR with 4k screens because otherwise you see pixels. They would even need more, as 2x 8k screens would not even be enough.
It's not about the resolution, it's about PPI. So at phone size, yes, you're probably right, you can't see 4k. However, on say an 84 inch display like the one I have, 1080p looks like trash if viewed from closer than 10-15 feet. Even from that far away it looks noticeably lacking in sharpness. THIS is the point of higher resolution.
@HEAVY SYSTEMS, Inc. ok, so what about VR? Pretty sure DPI will still hold true, but what resolution should be the go to or industry standard for 4" dual screens 1" away from your face?
noticeable improvement? Yes Looks like trash @ 1080p? Either you're lying, or you're a super privileged bitch who has set the bar waaay too high, or your TV is trash, or the media is trash. Because a good 1080p film should look just fine on a screen your size. Stop being a baby.
Jack Hutchison True, but usually they continue with 4K just because of how simple the Arri Alexa system is, at least compared to RED DCs offering. Still, movies are the only medium where 35 mm film continues to be used on a large scale, so that's something as well
Steve Jordan The only reason to shoot or edit in 8K is the same as it is for 4K: less fidelity loss during capture and editing; however, unless you're producing content to be shown in a movie theater, you're likely going to be targeting either Blu-ray or HD streaming. Even digital projectors in movie theaters are only DCI 4K (4096x2160). 25% market saturation in televisions is low, especially considering 90%+ of all video content is being consumed on a phone or tablet that's not even full HD. This means that your 8K content - that you filmed on a $100,000 Red Weapon camera (taking up half a petabyte of data for a 2-hour production) and then edited on a $100,000 editing bay with over $30,000 in hard drives alone - when uploaded to YouTube is being mostly watched at 720p anyway.
The problem with digital movies is not the format they were filmed in, it's the resolution of the master used for post production. A lot of movies shot in 4k actually have a 2K master because, until very recently, it was a lot easier to manage in the editing or vfx process, and also much cheaper. So there are a lot of 4k movies that are in fact 2K masters stretched for 4k blu ray, even if they were shot in 4k in the first place ;) The best examples are Pacific Rim and Gravity, with 2K masters that were stretched (in theatres the stretching was even worse because they showed these in IMAX^^). So if you are buying these 4K blu rays, you are only paying for better compression, not a bigger image than the standard Blu-ray :)
Not all movies are shot widescreen. Some movies are shot in what's known as "open matte". This is when they shoot on 4:3 film and then crop it to the widescreen aspect ratio in post. So in some cases, if you watch the fullscreen version of a movie, that's the one with the uncropped image.
He didn't even mention the most important thing to consider when buying 4k Monitors/TVs: Pixels per Inch and Screen Size. I use a 4K 40 inch TV as a PC monitor because the PPI is 110 instead of my old 1080p 40 inch TV at 55 PPI. At the close viewing distance I had it at (maybe a little more than 1.5 feet), the pixels are very noticeable/ distracting when gaming at 1080p but not a problem at 4K. To get the same pixel density of 110 PPI at 1080p, the screen size has to be 20 inches LOL, not nearly as immersive.
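(A minimal sketch of the pixels-per-inch arithmetic behind the numbers in the comment above, assuming standard 16:9 panels.)

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the diagonal of a display."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')   # ~110
print(f'40" 1080p: {ppi(1920, 1080, 40):.0f} PPI')   # ~55
print(f'20" 1080p: {ppi(1920, 1080, 20):.0f} PPI')   # ~110
```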
I remember enjoying this video when I first watched it. Now I see that the information in it is largely opinion being presented as undeniable and well researched fact, and often wrong. Makes me worried about the other videos on this channel.
@@XiahouJoe His predictions were inaccurate (because he underestimated the power of advertising and the gullibility of the average consumer), but what else do you think was inaccurate? P.S. It should be noted that a lot of the information here can also be found in several Vsauce vids as well.
@@InsomniacPostman I should note that YouTube keeps deleting my comments citing links showing he's wrong and that people can see the difference between 4k and HD; for example, search the term "Can people see the difference between 4k and HD", first article
@@XiahouJoe I've had a 4K monitor for the last 7 years and, other than a reduction in motion blur, not a single 4K video or game has looked any different from a 2k one. Not to mention that most "4K" films are actually 2K upscaled to 4K so all you're getting is a lot of unnecessary pixel doubling.
200 FPS on a 60HZ monitor actually feels smoother because the GPU has a more dense, and therefore more accurate, pool to fetch frames from. Also, the game has greater accuracy with gathering input data. Instead of it gathering the data within those 60 frames it can gather it within 200 frames, increasing the accuracy. So no, it isn't just an illusion.
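(A deliberately simplified sketch of the frame-freshness argument in the comment above: at each refresh of a 60 Hz panel, the newest completed frame is fresher when you render above the refresh rate. It ignores scanout, driver queuing and tearing, so the numbers are illustrative rather than measured latency.)

```python
def frame_age_ms(render_fps):
    """(average, worst-case) age of the newest completed frame at the moment of a refresh."""
    frame_time = 1000.0 / render_fps
    return frame_time / 2.0, frame_time

for fps in (60, 120, 200):
    avg, worst = frame_age_ms(fps)
    print(f"{fps:>3} fps rendered on a 60 Hz panel: newest frame is "
          f"~{avg:.1f} ms old on average, {worst:.1f} ms worst case at refresh")
```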
8:45 Have you ever wondered why virtually no one actually uses the programmed motion blur? Because it looks incredibly unnatural, which creates even more motion sickness. It also adds a ton of input lag, which makes the game feel unresponsive and creates even more motion sickness.
Philip Martin That's wrong. It looks unnatural because it blurs all the wrong things. Motion blur should only be applied to things that move with respect to your retina. The game assumes your eyes don't rotate and that mouse movement is your head movement. If your eyes follow an object moving across the screen you expect perfect sharpness, but you will see a blurry mess. The reason motion blur is tolerated in racing games and the like is just that the motion of the ground near the camera is hundreds of pixels per frame, and we don't have 1000 Hz displays yet to do natural motion blur.
I find a very light use of motion blur enhances my immersion, and of course it should be done right (not blurring the background, only the object that moves, and so lightly that it is barely noticeable)
Modern motion blur doesn't introduce input lag. Motion blur used to average a number of frames which could introduce input lag, but modern motion blur uses the relative speed of objects and a post-processing shader. There is no lag at all, assuming the shader doesn't lower your frame rate.
Philip Martin Interesting stuff, thanks. I think all the talk of resolution (in the digital number-of-pixels sense), frame rate etc. comes from assuming the eyes and brain are analogous to a camera and computer... which they're not, in any way beyond a very simple metaphor. The eyes and the brain are more like an analog system than a digital one in most ways, but not that similar either. It's almost like we're now viewing the world through a technological lens and overly comparing biological systems to technological ones. In the brain or the eyes there isn't a single piece of metal bigger than a few atoms, there isn't a single switch or transistor (as we use them electronically), signals are carried chemically, and much more.
Another example I keep thinking of is memory. Memory isn't some grid of cells somewhere - there is memory *at every level*, it's memory all the way down. "Memory" (past events leaving a mark and being able to affect the future) is present from the tiny chemical level all the way up to the physical organisation of cells. An event induces changes in the concentrations of various ions, in proteins that can change shape, and in levels of gene expression, and released chemicals obviously stick around a while (depending on the chemical, anywhere from nanoseconds to hours or longer). And that difference in how long, say, a hormone release will have an effect for shows another really fundamental thing.
In all computers and anything digital, time is made up of discrete sections or chunks, like the clock cycle of a processor, the frame rate of a camera or display, or the sample rate of some audio. A processor cannot do anything in a shorter time than that, and everything it does is made up of integer multiples of that time. It certainly isn't possible to have continuous time with that type of technology. Furthermore, the intensity of something has to be made of integer multiples of some unit, which again it doesn't have to be in organic systems. I've barely scratched the surface, but I think that's enough to show how different organic senses and processing are from digital ones.
"Wont be mainstream for another 5 years" *A year later and every supermarket and electronics store that sells televisions has their shelves completely lined with 4K TVs* 4K is in and 8K is already on its way with a 32" monitor from Dell that launched at $5000 and has already been cut in half with some being sold at $2500. Granted that's still crazy expensive. The point is it's approaching faster than anyone realizes. The importance of resolution is entirely dependent on how big the screen is, and how close your eyes are to the screen. The bigger the screen, or the closer your eyes, the smaller the pixels need to be for you to achieve an equally sharp image. Take VR headsets for example, their tiny screens have resolutions higher than 1080, but since your eyeballs are mashed right up against them, you can actually see individual pixels clearly. You'd NEED to achieve 8K resolutions in order to get an image as crisp as a typical HD television that you sit far away from. Also, in response to the whole "remember when you thought VR was going to be a thing?" Yea, it was and still is a 'thing' you goob. Sure, it still hasn't broken out into massive mainstream appeal, but the technology is still being improved.
Your reply really depicts the notion of "bandwagon". By mainstream he meant not just buying 4K TVs but also having 4K content for the average consumer to consume without spending extra. Most consumers only get 1080i from their cable providers; heck, some channels still provide 720p. So just because you go to Walmart and buy a 4K TV doesn't mean the average consumer is consuming 4K content. Also know the difference between upscaling, pixel shifting and native 4K.
Personally I've never been able to detect the difference between any resolution higher than 1536p (the maximum resolution on my Sony Trinitron monitor), and at that resolution I personally think my monitor actually pumps out a superior image to an LCD due to its superior color depth. As far as framerate is concerned, I don't detect any difference above 70 Hz or so. As for VR, that technology is "almost" mainstream now. The Samsung Gear VR is probably going to be the device that makes VR mainstream - that or the Oculus Go.
Personally I think the only reason why 4k became a mainstream format for TVs is because it has a higher number, which marketing can always spin into an "its 4000% better than what you have now" argument. In reality, though, content delivery is still at 1080p and will likely remain that way for some time.
@@johnrickard8512 People with a PS4 Pro or Xbox One X are justified in buying a 4k TV, and so are those who watch UHD movies; anyone using it purely for TV has been sold a lie, although they are future-proofed - they just probably paid more for it.
A lot has changed since this video released. 4k 120hz tvs are becoming more common but still gaming at 4k 120fps is not achievable for the average gamer. But we are getting closer to that each year. For now, I am happy with my 1440p 165hz monitor. More affordable and easier to run than 4k but still looks crisp
A 165hz monitor is pointless, though. Above 144hz you wouldn't notice a difference. I understand why you got a 165hz monitor - it's actually easier to get than 120hz to 144hz monitors because they are raising the standard. However, I feel bad for people who bought a 240hz+ monitor and then downgraded to 144hz because they realized there is no difference; a lot of youtubers have done that.
@@TheAazah Yeah, unless you are a professional tournament gamer and need every frame and millisecond for quick response times, it's not worth spending all that extra money for 240hz. I personally don't notice a difference from 144 to 165hz, and there are only a handful of games I play that max out the monitor. The biggest thing I did notice was the crispness of going from 1080p to 1440p. The LG monitor I got just happened to be 165hz, but I don't think I will actively look for anything more than 144hz in the future. I think we are a few years too soon before 4k 144hz is common for the average gamer
@@TheAazah This is untrue. The limit for human perception is closer to 1000Hz, although that is usually for flashes of light, and not necessarily framerate. I can notice a difference all the way to 360Hz so far (there are no panels above 360Hz that are available right now), though the main benefit is reduction of motion blur. It's less "seeing more detail/information" and more moving objects reach the point of zero motion blur, and it's all up to you after that. I'd recommend watching OptimumTech's video on the XL2566K to see what I mean. For tracking heavy, fast movement games, it's a pretty stark difference. *Addendum: I mean panels that can properly handle high refresh. Only the XL2566K and PG27AQN (both of which I own) can handle 360Hz correctly, all of the first-gen panels cannot. There is a massive difference in "fast" 240Hz and "fast" 360Hz panels vs. their first-gen counterparts to be wary of.
What was your first memorable bluray? Do you plan on upgrading to 4k?
Also... anyone know why the blood field you're constantly looking through is blue? The answer will be revealed in my color video!
And for the worried... I was putting sunbutter in Wheatley's mouth. It's totally safe but... maybe a little annoying. :D
KnowingBetter more videos please
+Ricardo Garcia More coming this weekend, apologies for the delay!
I would say Bram Stoker's Dracula was my first Blu Ray viewing, it looked far superior to any DVD version and I thought the DVD looked good. This 4K Blu Ray thing is a cash grab and nobody that I have talked too about it gives a flying fuck about 4K UHD, they are the next CEDs or Betas. They won't sell very well and will die out, that's what I predict......not enough interest in them so far. If they do get embraced I'll be very surprised.
What did you think of the picture quality when you first saw it?
My first memorable bluray has gotta be Dark Souls on the PS3.
I've worked in consumer electronics since 1997. I have learned the hard way to stop using the word 'never.'
The best life advice justin bieber has ever given me
Lol
never use the word never
no joke, i would like to hear some of those stories and "predictions" you made.
@@SpahGaming ?
11:02 Jokes on you, I’m watching in 144p, so the pixel wasn’t even there.
Me too. Feels bad man.
its there
Tested it, it was barely visible
360p: NO dot, I even checked with a 60x magnifier!! Went to 1080p, the dot's there!!
it was there, you just didn't see it
Humans don’t “see” in a quantifiable resolution. At least not one that’s comparable to a digital output resolution for a display. The main appeal behind 4K is not just the resolution, but the size at which you can display that resolution and still maintain an acceptable degree of definition
Exactly. It is arguing apples and oranges.
exactly, now 4k on a phone or laptop that's just stupid
@@Ik0As 1440p looks amazing on my phone, so I think 4K would look amazing on a laptop.
Now pixel density actually matters
You want a 24" monitor? Yeah 4k is not necessary. But on a 55" TV? Then it makes a difference.
We were filming in 4k raw in film schools in 2017. However, most of the time you compress down to 1080 for distribution. The main advantage of filming in 4k before compressing to 1080 is that you can make more corrections in post before noticably affecting the quality of the film.
This is the same reason pro cameras often shoot in 15 megapixel or higher (some as high as 50 megapixel). Besides allowing you to produce huge wall-size posters that stand up to detailed close examination, it gives you tons of room to crop in post, and to do edits like object removal more cleanly.
@@chiaracoetzee The benefits in the editing room are nice but the biggest benefit comes when printing on 70mm film, which can have a horizontal resolution equivalent of up to 12k (estimated; depending on pulldown direction, perforations per frame, stock quality, etc)
@@chiaracoetzee same reason pro audio is 30+ bit at up to 196kHz, even though you cant hear better than the standard 16bit 48k used for distribution. It gives us headrooom for processing and avoiding aliasing
Makes sense. The more detail you provide for yourself in the raw materials, the more nuanced editing you can achieve.
in modern day you always export to 4k for pro use.
Crazy how much technology has advanced in just 3 years
Yes, it’s crazy
Yeah, 5-10 years lol
@@brenthawkings4761 for real, some phones have 4k screens now (sony xperia 1 mark ii)
@@mundaayn why though? Okay I highly disagree with this one lol 4K on a screen as small as your phone.
@@MK-bi1hj true, i just wanted to show the tech advancements, but personally I think even QHD is overkill for a phone
"You'll never get a TV with pixels the size of atoms."
Who's here laughing in 2095?
2100 pixels are almost non existent
You guys are still using pixels?
@@archungus Wait until they come out with GrapheneTVs or for short GTVs.
Aww man, I'm just from 2077.
LOL
What I learned: fighter pilots take cocaine.
Not coke, but they typically have an adderall or some other amphetamine with them to pop if they need to on a long mission
Also the Nazi armed forces, not known for doing things lightly, were rather liberal users of methamphetamine
@@brynclarke1746 woosh
Want a random fun fact about a Vietnam War Jet Fighter?
@@Tigershark_3082 yes
@@ianthompson2802 The Fairchild Republic F-105 Thunderchief, despite it's fairly not good-looking combat record (more than 300 of the 800 built were destroyed in crashes), they managed to pull a combined total of 27.5 kills against enemy Mig-17s. 3 were from the use of AIM-9 Sidewinders, and 24.5 were from the use of its internal M61 Vulcan 20mm rotary cannon (the .5 came from a shared mig kill between both an F-105, and fellow F-4E Phantom II).
Everyone forgets how amazing 1440p is.
Absolutely QHD/2k/1440p is the perfect resolution. Shame Sony haven’t added it to the ps5.
1440 suck honestly, as its own standalone resolution its great, it just doesnt scale well at all with 1080, thats the nice thing about 4k. It scales perfectly with 1080p and similar resolutions.
@@killingtimeitself yessir
My old Galaxy phone was 1440p, and it was an amazing device to use.
@@SynthiaVan and thats at the size of a phone screen, where pixels per inch are substantially higher than normal.
"Unless you're on cocaine or something."
You say that like there's a chance I wasnt
Ya I always see in 240hz
Factual
"8k will never happen"
Samsung: allow me to introduce myself
Also, Nvidia would like a word
16k is already a thing lmao
@@clownemoji2153 yeah, but there's pretty much no good way to play it.
canon eos r5: allow me to overheat myself
@@MK-bi1hj a 3090 could upscale to 16k pretty effectively, and if you've got 16k tv money, you've got 2x 3090 money too!
I see in 40K, and I'm seeing some serious heresy...
@Gaming time lol, he was joking.
@Gaming time wtf are you talking about
@Gaming time Go back to Arabic YouTube then.
@Gaming time
Arabic YouTube? Oh, you mean Goatshit. Jk
Drake Kocjik is that fucking you
And here I am watching this on my 8k TV.
4k is very important for filming because you can make close ups in post production without losing quality
Hey man, I work in the visual effects industry. I want to correct something you said. Most of the work in television and film is shot in 4k or higher these days. Pretty much every professional camera does it. Dp's want the larger number of pixels work with in post, as you can shoot wider and crop, or add in visual effects and maintain higher fidelity than if you did the same with 1080. And yeah, I've worked with 8k footage as well.
It is ironic because we used to work in lower formats like SD in post production and upres HD at the end. Now it is the opposite. Computers are fast enough to edit in high resolutions, cameras shoot in higher resolutions, and then the show is brought down to 1080 for the marketplace.
Lol, that's awesome. I upgraded my gaming equipment to record 4k for the same reason. Not because I think that 4k is better, but because sometimes I can make a better video for my youtube channel if I can crop the screen and not lose any quality. It lets me functionally zoom in to get angles that would otherwise be impossible.
Yeah, and I always look for lower quality so I don't get sick; I even had to find a used TV to watch stuff. Idk what I'll do when regular HD isn't available anywhere. Trying to play games on an Xbox One X is terrible unless I plug it into an old standard HD TV; I keep it set to 720 and do great, don't feel like I miss anything, and that's with or without my glasses. 4k is just too "fake looking"
Rotoscoping on low-res and low-shutter footage is an absolute bitch to work with. Like, we can add that "motion blur" from a matched shutter speed afterwards; it's just so much nicer to work with footage that was shot at a high shutter like 1/144. MMMMMM, clear crispness. I can dirty it up for the people who like that "wave your hand in front of your face" blur thing they're obsessed with
you can distinguish 4k from 1080, on a big screen especially
* barely.
Maybe if you don't look at 4k commonly. There is a noticeable difference.
* barely. Text? Sure. Basically, applications when you're sitting right in front of the monitor. I am sitting in front of a 4K monitor as I type. Have to turn my head to see the other side of it. It's more like having two 1080P screens side by side with no gap.
It would get more important the bigger screens get. There is a reason 480p looks clear on a small monitor. There will probably be a point where 1080p on big monitors will start to look unclear. Think about the future.
Only on a big screen. And that's only because the 1080p won't look as good at such a big size.
Watching this 5 years after it was posted. You nailed the 4K popularity timeline, KB!
8k will never happen.
2020: *allow me to introduce myself*
Well, 8k already existed when he released the video; I think what he actually meant is that 8k is stupid.
I somehow doubt that’ll be what 2020 is remembered for...
More like mass adoption, without the forced gotta-buy-it-now push from manufacturers, will never happen. Why did nobody actually pay attention to the video?
8k is probably only gonna be in theaters.
You need a 100 inch TV to notice 8k. For 16k you'd need a stadium jumbotron.
8k will be useful for VR which benefits from extreme pixel density due to the screen being stretched across your entire field of vision.
VR is an even bigger fad than 4k monitors.
I don't even know anyone who actually games in VR.
And I'm saying that as someone who used to develop VR games.
The company that I used to work for was one of the early adopters; we received Oculus Rift prototypes before VR gaming was a thing.
It was cool but I immediately realized how much of a fad this was. People these days play games where you can prove your expertise and skills.
Games that tend to be extremely competitive are the ones that are the most interesting to watch and play.
Shooters and MOBA's are very good examples of this phenomenon. VR gaming is basically a "prosthetics" simulator.
You play in a virtual environment and the only way to interact with it is by means of prosthetic limbs. It's awkward at best.
Sorry for rambling, I just had to get that off my chest.
Have a nice day.
Simulation will switch to VR once the quality is there.
AntonioKowatsch Yes, it's years, decades away from a feasible good experience; right now it's a novelty, and amazing for porn
You seem to forget that gaming is a hobby as well as a sport; old people will tell you of when they thought rock and roll was just a fad....
AntonioKowatsch VRChat has like 3 million downloads
“Every single person in high school right now was born after the matrix came out”
*laughs in held back*
lies! has it been that long?
damn, Bill Nye got held back
It's sad really... so many people are gonna call that movie a classic... when really it (and the whole series) was just visual garbage that borrowed from everything, with nothing really original in it.
Denver Starkey it’s a classic
@@saucyx4 And therein lies the problem with the younger generation. Calling that movie (The Matrix) a classic is just shameful. The concept is ripped right from Terminator, and the action sequences are ripped right from the stills of anime. The character development is really nonexistent for any character outside of Neo himself, and his development is a little thin and literally happens entirely in the first movie. Look, I'm not saying it's a bad movie. It was an enjoyable movie. But it lacked originality and depth. It just doesn't deserve to be viewed in the same vein as movies like E.T. or Terminator or Jaws.
Came back to watch this 5 years later where 4K is the norm and totally streamable, 1080p is what your grandparents use, and 8K is the new "Nobody can tell a difference!" Now you can pick up a cheap 512GB thumb drive for $10 and with modern compression you can fit a half dozen 4K videos on it.
You tell e'm mate
It's funny, I wrote this guy off after this terrible video all those years ago but then he started doing history content and it was actually good.
And our smartphones now have an option to film in 8k resolution... Our dashcam footage at 1080 vs 4k would like to have a say in this matter.
4k is not the norm outside of gaming.
@@MelkorTolkien and streaming and cinema. Other than those things, no.
144p gang rise up!
little face missing a zero
Adrian 0144p
Nah 072i
@@seamusmckeon9109 1i
Ima 360p guy 480p is too good
Filming in 4K can be useful if you are planning to downsample your final output to 1080p. This offers video editors the luxury of 2x digital zoom with no loss of resolution. I used to film in 1080p and render in 720p all the time. That 1.5x digital zoom was critical for making crisp-looking virtual camera movements.
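(A quick sketch of the crop-zoom arithmetic described above; the frame sizes are just the generic UHD/FHD and FHD/HD ones, not any particular project's settings.)

```python
SRC_W, SRC_H = 3840, 2160      # capture resolution (UHD)
OUT_W, OUT_H = 1920, 1080      # delivery resolution (FHD)

def max_lossless_zoom(src_w, src_h, out_w, out_h):
    """Largest punch-in whose crop still has at least one source pixel per output pixel."""
    return min(src_w / out_w, src_h / out_h)

print(f"4K capture -> 1080p delivery: up to {max_lossless_zoom(SRC_W, SRC_H, OUT_W, OUT_H):.1f}x zoom")
print(f"1080p capture -> 720p delivery: up to {max_lossless_zoom(1920, 1080, 1280, 720):.1f}x zoom")
```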
most 4 million + youtubers film in 8k ;) cinema is often filmed with dual 8k arris or red cams.
What?
this video is about consumers, not producers.
"Filming in 4k" -> Producers.
Filming in 4k also increases temporal resolution in general, so a much cleaner image is the result. This is also true for gaming.
there youtube, i finally clicked on it. now leave me alone
Lol same.
Was it as stupid as you imagined it would be? This guys a dumbass.
I was looking at your comment but it was 1080 60 megahertz 5K and which I could not be read due to the resolution in pixels that are deviated from the 6000 BTUs that I'm used to
Genious
Sosa Kun, why do so many people make this comment on YouTube? Is it a special club you are in?
Anyway, the more you click on suggestions the worse it will get.
It's 4 years after this video and I'm still only just switching to Blu-ray and still buying movies from Redbox
"You cant see the pixels"
Isn't that the point? I don't WANT to see the pixels lol.
After a certain point, it's diminishing returns.
It's awesome how you just skip past all the explanation to make a shitty comment
@@omitorrent7492 , I did watch the video. I was just commenting on one thing he said. Calm tf down.
@@omitorrent7492 are they supposed to make a sentence for every single thing he said
Ignoring Omni Torrent...
...Ideally you don't want to see the _dead_ pixels. Mild pixelisation is usually tolerable, but having visible dead pixels can really ruin the enjoyment of a screen.
1080p still looks great but 4k looks super sharp, there is a decent enough difference.
@unrepeatable raddish Which is all relative to the size of your home, buddy.
Displaying 4k on a 480i CRT can cause some major headaches.
@unrepeatable raddish 8k currently is pointless unless you have one of those really, really huge TVs (so not talking 55" here)... but if those TVs become the norm then 8K will be great.
The decent-enough difference is mainly in 4K-specific marketing videos made with super-saturated color. Once it becomes part of the home with regular input devices (streaming, Blu-ray, etc.) it becomes a very expensive object side-by-side with a good old 1080p set. The big differences I can sense are contrast and light, typical for any new TV. They do get brighter, sharper and higher-contrast. But that is not 4K specific. The regular consumer should not spend their savings on it.
@unrepeatable raddish and viewing distance
35mm celluloid film (which is what was usually used to film movies pre-2005) can be digitised at up to 16K, just so you know. The true definition king is a 35mm projector at 25-70 frames per second. Plus then you have Panavision ultra-widescreen, which is 70mm (used for Ben Hur, Lawrence of Arabia and The Hateful Eight).
I had a teacher who drove 4.5 hours in a blizzard to see a one night only showing of Lawrence of Arabia in full 70mm. His take was that even though you can get access to that now easily, it was worth it to see it before his eyesight went bad 😂😂😂
It’s worth mentioning the image size of 70mm movie film is roughly the same as 35mm stills film, because of the direction change.
That’s also where 4:3 came from, as stills were 3:2 but if you split them in two and rotate them you get 4:3.
And also that film grain was larger when 70mm motion pictures were more popular, a more recent 35mm motion picture has about as much resolution as a 70mm film from the heyday of 70mm. Of course all 35mm film from before that is much grainier, and indeed tops out between 4 and 8k instead of 16k.
Now that the content finally exists in good quantity I am a 4k UHD Blu ray believer. The hype is real... sometimes... if the transfer is good.
I watched this and then checked his channel looking for the “I was wrong video” listing how readily available 4K content now is on Netflix, Apple TV, etc; how UHD blu-rays exist and growing in popularity and how HDR and other features make 4K tvs far superior.
Or how increasingly higher resolutions in video games increase the rendering resolution, show more detail in game and let you see farther - but keep in mind, this video was a year ago.
"This is 1080p" joke is on him, I'm watching in 360p.
Shameless edit: Look on my channel for rubbish gaming clips
144p. Get on my level.
@@joelharber2100 and I thought my Internet was slow!
Me2😂😂😂😂😂
Joel HARBER I’m watching in Minecraft. Beat that!
Joel HARBER my video won’t even load. Step your game up. I have no idea why we’re even talking about this btw...
Here's the deal, though: Your fovea, that all-important center of the eye that can see the finest detail, could be looking anywhere on the screen depending on where the viewer's attention is currently. So you have to design the whole screen uniformly and assume the viewer can be looking at any position at any time. It doesn't matter so much whether the TV has overall more pixels than your retina or not: It matters whether or not the TV will look crisp to your fovea.
No, you do not design a TV based on the idea that the fovea, the most important part of our eyes and the part that will definitely notice a lack of detail, is irrelevant. That would be stupid, basically like most of this video.
Would be interesting to see a screen with eye tracking used to bring into full focus only where you are looking and save bandwidth with the rest of the screen.
Certain VR headsets are implementing this to have higher detail and lower processing requirements.
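(A toy sketch of the foveated-rendering idea discussed above, using numpy; the gaze position, window size and downscale factor are arbitrary illustration values, not how any particular headset actually implements it.)

```python
import numpy as np

def foveate(frame, gaze_xy, radius=128, factor=4):
    """Blockily downsample everything outside a square window around the gaze.
    Assumes the frame's height and width are divisible by `factor`."""
    h, w = frame.shape[:2]
    # Cheap periphery: average factor x factor blocks, then stretch them back up.
    low = frame.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))
    out = np.repeat(np.repeat(low, factor, axis=0), factor, axis=1)
    # Paste the full-resolution "fovea" window back in around the gaze point.
    x, y = gaze_xy
    y0, y1 = max(0, y - radius), min(h, y + radius)
    x0, x1 = max(0, x - radius), min(w, x + radius)
    out[y0:y1, x0:x1] = frame[y0:y1, x0:x1]
    return out

frame = np.random.rand(1080, 1920, 3)       # stand-in for a rendered 1080p frame
cheap = foveate(frame, gaze_xy=(960, 540))  # full detail only near the gaze point
print(frame.shape, cheap.shape)             # same shape, but the periphery carries ~1/16 the detail
```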
A family with three kinds, and guess, guests for the weekend...
Adam Savage I think some games are doing that to boost performance, but I haven't heard of TVs doing that. Again, TVs don't know where you are looking, and even in cases where a TV show or game is intentionally doing some blurring, the TV itself is still gonna have the same resolution everywhere. That said - you are likely correct as far as games go. If a game can blur part of the image where they don't expect (or don't want) players to look, they can reduce a bunch of detail in the blurred areas, which would improve the performance of the game.
Adam Savage Well yeah what you describe with the textures is part of what's done as well. There's actually a lot of tricks behind the scenes of any game. As far as variable frame rate, there's FreeSync (AMD), G-Sync (nVidia), and HDMI 2.1 (hopefully everybody, but adoption is slow).
I honestly hate motion blur in video games and I always turn it off if I can. I find that the "faked motion blur" actually makes me feel ill. I find it interesting that the Soap Opera Effect can make people feel ill.
It’s fascinating how differently perception works for different people!
I hate the Soap Opera Effect, I can stand projectors with colour wheels, and I can’t bear looking at backlit LCD screens without protective glasses for the blue light!
This is so misleading. The number of pixels in an image has nothing to do with the capability of the human eye to see one particular pixel. It has to do with the size of the display on which you will see that image. Suppose you take a picture of a person with a camera with a resolution of, say, 640x480 pixels. If you were to see that image on a 43 inch TV it would look very pixelated, whereas on a 15 inch monitor it would seem fine. With the advance of technology and the increasing size of displays, it is necessary to capture a picture/movie with as many pixels as possible so it looks fine on a big screen. And yes, as of 2018, 8k is coming to people's homes. You made a prediction that 4k would take 5 to 10 years to be used by most people, and you're way off that reality.
Yea... the funny thing is, where I used to work we were actually showcasing our prototype 20k displays, and that was almost 3 years ago (20k displays for showpieces in large malls/venues).
It was kind of just showing our customers what we were capable of... the fun part was playing Half-Life 2 on it... the bad part >> getting yelled at by some higher-ups because we were playing a video game on it...
The reason was not having access to the 20k footage needed to put the demo up... we were the guys who worked on site and tested all the racks before they got shipped out... we were told to get it set up due to a drop-in by some regular big customers, and they wanted to impress...
After a few hours of not hearing back from the people who were sent to the onsite locations, we took it into our own hands to run the Half-Life 2: Lost Coast demo (we set it up in house to showcase that, with our software and expertise, you could run essentially any graphics on the fly with sound matching what is happening on screen in realtime... we did that for 4k-8k screens and upscaled it to 20k).
It was odd that we impressed the customers after giving our explanation of the demo and all the applications they could use our system for... but got in trouble with upper management because we didn't use the regular demos (our demo was done by our own area for the tours that regularly walked through for customers... it was fun because the people visiting our area were normally the technician guys who would be working our product at their venues, and they'd geek out about what they could potentially do).
Also... sad to say, for true 8k you are looking at a minimum of 30k worth of equipment just to actually deliver without any issues in realtime, even now... with the next wave of tech in the next 2 years it may be down to 10k, unless prices go up for parts (retail, those 30k parts would be worth in the 100+k range due to the needed proprietary software and expertise, as well as the law of economies of scale....) Granted... it is possible to make cheaper 8k, but it would come at some cost in performance... buffering... possible sound desynching and what not...
Well, theoretically I guess an Epyc chip might be able to pull it off at around 10k cost now.... hmmm
8K requires ridiculous inputs and no interface will scale properly with it. I saw some Dell 8k monitor demos, you'd need a microscope to find the start button.
Chino Gambino Basically nothing is ready for 8k *yet
You forgot to add one thing to this - Viewing distance. The closer to the screen you are, the better a higher resolution is.
It's like, everything looks good on my 50" 1080p TV if I'm lying in bed playing a game. If I move up and sit on the edge of the bed, I can see the blurring or pixelated look of games. When I'm using my PC monitor (21:9, 1440p) I don't get that same effect.
@@chinogambino9375 I'm sure there will be future updates allowing you to scale the windows interface to fit your display.
I wondered why my eyes are always hurting. I gotta stop staring at trees.
15:06 "anything filmed before Last year (2016) will never be in 4k because it wasn't filmed in 4k"
Yet here I am watching Terminator 2 (1991) in 4k, because it was shot on 35mm and rescanned in 4k. 35mm has a maximum detail of about double that of 4k.
Actually, Lord of the Rings was filmed using 8k digital cameras, but the 8k footage was deleted when they downscaled to just 4K for archiving the raw footage. These guys would FILL hard disks per day with shooting footage. YES, 8K back in 2003, it was a thing
And what about the CGI? Did they bother to render it in 4K and re-edit the whole thing?
@@fisherdotogg The CGI scenes are about 720p in quality, way softer than the rest of the film.
And here I am watching Star Wars Episode IV with an incredibly good 4K rescan; I'm amazed, and this guy on youtube is just jealous of 4K TV owners and advocates his theme very well, but come on, get an OLED 4K TV and see if you can't see the difference, brah
Wait what the heck is 35 mm and how does it have a maximum detail of 8k?
I came back to this video just to say "4k" is today.
I had dual 4k 60hz monitors for my PC, and after a while, I decided that higher resolution was pretty much pointless for me. I switched to 1440p 144hz monitors, and I'm MUCH happier with it.
I have dual 4K 144 Hz monitors which is kind of best of both worlds, but those might not have existed a year ago - they're still quite pricy now. My girlfriend still prefers 1440p 240 Hz.
Am I able to identify individual pixels on a 4K image? No.*
Am I able to identify a sharper picture from 1080p and 4K comparison? Certainly.
(*I think that's the whole idea)
This.
Same thing with framerate. But the argument that the people watching this probably aren't the average consumer definitely does have merit in my opinion. When you start looking at people in their forties (the average age where I live, not sure about the USA), their vision is often just not as good as someone in their twenties or even younger.
Unless you sit close you won't see the difference. Some 4k TVs have higher picture quality, but that's not because of the resolution.
For me, it's something I don't notice nearly as much until I go back. Like, I have a 1440p 144hz monitor for my PC, and I can *not* go back to 1080/720 30fps on my PS4. I imagine if I upgraded to a 4k monitor, I would notice the difference, but not as much as I would notice going back to the old monitor.
Sarcasm I agree. Though I also agree that the jump from DVD to Blu Ray was waaay more impressive than the jump from 1080P to 4k. We are beginning to see the law of diminishing returns coming into effect with resolution.
The fact we only have 7 million cones in our eyes is irrelevant to the pixel resolution or density of an image. Eyes don't see the world in a pixel-like way, and a vast amount of the processing is done by the brain; the number of cones isn't really the issue. Eyes are not true analogs of digital cameras. The real world is made of continuous shapes and textures that are lit with infinite light rays. That's why games and movies at higher resolutions are perceptibly different: they are rendered in pixels, which the eye resolves as discontinuous and can tell apart at different resolutions. There is a point at which it becomes imperceptible, but it's above 4K.
pretty much 7million p
Exactly. Acting like we can't see anything above 7 million pixels because we only have 7 million cones is so fucking stupid and just shows this guy has no idea what he's talking about. He chucks around the fovea and the blind spot like they have anything to do with being able to see higher definitions.
Get someone to watch a 1080p video and then a 4k video on a large 4k TV; you can clearly see a difference, and if you can't, your eyes are fucked
The dot is a bad example; you could have changed your shirt between two shots and most people wouldn't have noticed that either. You absolutely see it when you know it's there.
Anyway, I did a blind test on my 55" 4K TV from 5m away and we could recognize what was 4K and what was Full HD. We don't see in 4K, you're right, but as you said most of the detail we see is in the center of our FOV. We don't focus on the whole screen at once; instead we pick multiple spots of interest in the image and see insane detail in those precise areas, and by doing that many times a second our brain merges it all into one single "image", so we think we see everything when in reality we miss the big majority of the screen. The problem is that the spots are quite unpredictable from one person to another, so the movie can't contain only the areas we look at; it must have every single pixel in full detail even though we randomly look at only a few percent of them at any one time. That's why I think 4k is a good evolution of Full HD. An uncompressed 4K movie would be 500 gigs, ok sure, but uncompressed 1080p is super heavy as well; that's why we compress it, to make it more accessible for our current tech.
About 8K... maybe I'll change my mind after seeing it, who knows, but I think at that scale we effectively can't see it, or not enough to be worth quadruple the amount of data. We should stay on 4K but maybe focus on better color accuracy, dynamic range or framerates for gaming. Having tried a 4k 120Hz display with Overwatch for a few minutes, I can say that the difference will be really worth it when it becomes more accessible.
What? The way cones work is exactly like a pixel analog: they absorb light and fire signals into the optic nerve in proportion to the light they are receiving, and once fired the cone has to return to its initial state and won't flare again for some time, which results in retinal persistence.
If you have 2 levels of light of the same wavelength hitting the same cone, it will flare with the average of the 2.
I don't know where you got this from, but it's literally impossible to receive more discrete units of visual information than you have sensory cells to deliver.
@@pmangano There are similarities to how cameras and eyes receive light as you've pointed out. There are also differences (angle of incidence, colour perception and other aspects) which is why I stated that eyes are not direct analogs of digital cameras. The eyes are only half the story however, how the brain perceives and constructs images is the interesting and different part. Even though signals are sent in a discrete fashion images are constructed as continuous shapes as if lit by infinite rays. It's like a biological digital imager connected to a powerful AI. Leaving the latter out would leave you to some incorrect assumptions, which many pop-sci articles on the internet make.
120fps on a 60hz monitor does actually have an advantage, because the image shown on the screen is a more recent and thus more accurate representation of the current situation. Linus did some videos on it.
Shut
@@tomlxyzshut
Yeah, the information is 16ms old rather than 33ms old when it's displayed. There's a lot wrong with this vid. He mentions the fovea and its concentrated resolution and never deals with the fact that the fovea moves across every point of the screen, so average resolution is meaningless. Every part of the screen has to be as detailed as the fovea.
Here's the optics math to prove the *average* person can see 4K:
In order to determine if a person can see 4K, we're going to use mathematics from astrophysics / photography. What we want to know is the maximum PPI (pixels per inch) that our eyes can resolve at a typical monitor / TV viewing distance. Then we'll compare that max PPI to the known PPI values of 4K monitors of various sizes.
---------------------------------------------
First we have to determine the viewing angle of our eyes. The equation for this is: alpha = 2 * arctan( d / 2f )
-- where d is the size of our film (cone width), and f is the focal distance to our lens.
We know these values for the eye. For the *average* human, the cone density in the fovea centralis is 147,000 cones / mm^2, or about 383 cones / mm (measured in one dimension.) We can therefore use 1 / 383 = mm / cone as our value of d.
f -- the focal distance from the fovea centralis to the lens is also well known: the average human value is 17.1 mm.
So the *average* human viewing angle is: alpha = 2 * arctan ( (1/383) / (2*17.1))
Plug this into a calculator (and ask for the result in arcseconds) and you get 31.49 arcseconds.
So the *average* human viewing angle is 31.49 arcseconds. Anything smaller than this at the observing distance will not be resolved.
---------------------------------------------
Now we'll find the average PPI that this corresponds to at the relevant viewing distance. Let's use two: 1) monitor viewing distance, 2) TV viewing distance.
I just measured the distance from my head to my monitor and got ~31 inches. Call this D(mon).
Let's assume a TV viewing distance of 6 feet, or ~72 inches. Call this D(tv).
--------------------------------------------
Our viewing target is a single pixel, and its size depends on the PPI of the display. That size is straightforward in terms of PPI: the size of our pixel (d) is (1 / PPI). You can show this pretty easily: PPI is measured in pixels / inch, the inverse of which is inches / pixel, so (1 / PPI) is a measurement, in inches, of the size of our pixel.
In summary: d = (1 / PPI), so (1 / d) = PPI.
--------------------------------------------
With this information, it's very easy to find the max PPI of our vision. First, let's set up our equation and then solve for (1 / d):
The equation is:
theta = 2 * arctan (d / 2D)
This is the same equation as before, with different variables. Theta is our viewing angle (we know this is 31.49 arcsec from our previous calculation.) D is either 31 inches or 72 inches for our purposes (although we can set it to anything), and (1 / d) is the PPI we want to solve for.
Manipulating the equation, step by step (for those with no background in math):
theta / 2 = arctan (d / 2D)
tan(theta / 2) = d / 2D
tan(theta / 2) / d = (1 / 2D)
(1 / d)= 1 / ( 2D * tan ( theta / 2 ) )
-------------------------------------------
Plugging in our two values:
1) Monitor viewing distance: (1 / d) = 1 / ( 2*31 * tan(31.49 arcsec / 2)) = 211.3 PPI.
2) TV viewing distance: (1 / d) = 1 / (2*72 * tan(31.49 arcsec / 2)) = 90.97 PPI.
These values represent the *maximum* PPI that the *average* human can resolve for monitors @ 31 inches and TVs @ 72 inches respectively. Values above these numbers are lost on the *average* human eye.
-------------------------------------------
Only one last thing to do: compare to the PPI of 4K monitors / TVs. You can easily look this stuff up:
21.5 - 27 inches seems like a reasonable size for a monitor.
At this size, the PPI of a 4K monitor is between 205 @ 21.5 inches, to 163.18 @ 27 inches.
50 - 70 inches seems reasonable size for a TV.
At this size, the PPI of a 4K tv is between 88.12 @ 50 inches, to 63.2 @ 70 inches.
-------------------------------------------
So can the average human eye see 4K?
Absolutely:
163 - 205 PPI is less than the max of 211 for monitors at 31 inches, and...
63.2 - 88.12 PPI is less than the max of 90.97 for TVs at 72 inches.
-------------------------------------------
This all assumes *average* vision. People with good vision have substantially more cones in their fovea centralis. Science shows a large variance among individuals, and it's not inconceivable for people with good vision to have up to 320,000 cones / mm^2 in their fovea centralis.
To save you the time, this is equal to 21.3 arcsecond angle of view, corresponding to a max PPI of:
Monitors @ 31 inches: 312 PPI
TVs @ 72 inches: 134.5 PPI
At 8K (!!!!) this means someone with very good vision could pick up the following monitors / TVs and get something out of them:
28.5 inch 8K monitor: 310 PPI
65 inch 8K TV: 135.5 PPI
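If you want to check these numbers yourself, here's a minimal Python sketch of the same calculation. It assumes the figures quoted above (147,000 cones/mm^2 and a 17.1 mm focal distance); the function names are mine.

```python
import math

ARCSEC_PER_RAD = math.degrees(1) * 3600   # ~206265 arcseconds per radian

def eye_angular_resolution_arcsec(cones_per_mm2=147_000, focal_mm=17.1):
    """Viewing angle subtended by one cone spacing at the eye's focal distance."""
    cones_per_mm = math.sqrt(cones_per_mm2)      # ~383 cones/mm in one dimension
    d = 1 / cones_per_mm                         # cone spacing, mm
    return 2 * math.atan(d / (2 * focal_mm)) * ARCSEC_PER_RAD

def max_resolvable_ppi(theta_arcsec, distance_in):
    """Highest pixel density (pixels/inch) still resolvable at this distance."""
    theta = theta_arcsec / ARCSEC_PER_RAD
    return 1 / (2 * distance_in * math.tan(theta / 2))

theta = eye_angular_resolution_arcsec()
print(round(theta, 1))                       # ~31.5 arcsec
print(round(max_resolvable_ppi(theta, 31)))  # ~211 PPI at 31" (monitor distance)
print(round(max_resolvable_ppi(theta, 72)))  # ~91 PPI at 72" (TV distance)
```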
-------------------------------------------
So where does Knowing Better's analysis go completely wrong?
He makes two downright terrible assumptions:
1) Every cone in the eye can only look at ONE pixel on the screen.
2) That pixel density and cone density don't matter.
Also -- you know --- he just doesn't use the appropriate optics math at all.
-------------------------------------------
Hopefully now: you know better.
*drops mic*
Damn... You are good at math :)
I just fucking came in my pants. Will you take my derivative, please?
Should have pinned this with a thank you comment on it.
;)
The one single thing you have failed to realize is that 720p, 1080p, 2160p, etc. are NOT optical resolutions like those measured for telescopes, binoculars, microscopes, glasses, etc., which aid the human eye in seeing real objects. They are OBJECT resolutions, which actually limit what the eye can see. The world is made of atoms, not pixels. As you get closer to a TV you can see the pixels better and better, meaning the overall picture gains no extra detail. In real life, as you get closer to a tree, for example, you see MORE detail. The serrations on the leaves and the texture of the bark all get clearer until you are so close your eye can't focus on them anymore.
This is why more pixels on a screen will always result in a better and more realistic looking picture. There are almost infinitely more atoms in any real scene than there are pixels on even a 4K TV.
Probably THE single most intelligent rebuke to this jerk's video!!!
+
I guess he's never tried a VR headset where 1080p looks like utter crap because your eyes are so close to the screens.
this is my favorite comment. So true.
I think the overarching point is simple. At some point you're going to have more detail than your eye(s) can see no matter how close you stick your face to the screen.
"8K on the other hand? It will never happen"
3 years later, nvidia's already pushing 8K gaming with 3090 lol.
guess it's all about anti-aliasing from now on. people love extreme things, when it doesn't make any sense no more, and companies know that and use that perfectly. I'm saying this but the idea of 4K and 8K excites me as well. what can I say?
I mean, if you have a game in 8k in a 1440p or 4k monitor you probably won't be needing antialiasing
I mean all the people they showed in the announcement of the 3090 looked pretty amazed with 8K so maybe there is some kind of noticeable difference
That part of the video did NOT age well.
@@pieternaude1469 the entire video didnt age well
dude, if PS5 and new Xbox are going to run games 1080/60, it's already a huge step forward
7 years later, the video didn't age well. 4K is noticeably better and allows for much larger TVs while keeping the pixels per inch acceptable.
"Your eyes don't see in 4k"
* 2018-2020, everyone only sells 4k panels
"Your eyes don't see in 8k"
cries in poor
only a third of currently bought tvs are 4k, you lads are daft
Non-4K TVs aren't being produced anymore, or even shown in most of the big competitors' stores. At least not in my area.
@@recoveryemail1046
They put 4K TVs on sale all the time now to entice folks into switching. I bought my 4K Samsung back in 2018 for only 420 bucks. At the time it was the highest-rated 4K TV in its class for image quality (49"). It's a smart TV too, with tons of features built in, and it even upscales images to 4K (albeit upscaling DVD-quality stuff doesn't look good). Blu-rays and 1080p content upscale beautifully on this TV.
4K is the way now and there is a visible quality difference. I shoot in 4K, 4:2:2, 10-bit, 400Mbps, ALL-Intra on my GH5s, and used to shoot 1080p 100Mbps 8-bit on my GH2 camera. The difference is vast, especially in colour depth. HD is still very good, but I shoot in 4K and 4K is really picking up. 8K is here also, but I must say I don't see the difference on a normal-size TV. I would consider 8K for video cropping when editing projects.
This video sounds to me like 'you don't need a 300 horsepower car because there are speed limits'
You don't need a smartphone because a flip phone can call and text and you already own a tablet so why not use that for everything else?
@@rileyesmay Tablet doesn't fit in Pocket
@@ahang6904 iPod touch then
@@nate2604 you cant do hard gaming on Ipod touch
@@nate2604 that's just a fucking old iphone with no sim so u can't really even call with it
There is absolutely a huge difference between 1080p and 2160p. It does depend on the size of the screen: the larger the screen, the more you notice the difference. I would agree, though, that at some point it will stop making sense to increase pixel density. That's why manufacturers are going for color depth instead.
Johannes Snutt I remember reading a tech article stating they can make 8 and 16k now, but it's pointless due to cost, and nothing to watch on them anyway
I think you have to factor in DV/HDR, how close you are sitting to your set, and whether or not the source is true 4K to make a claim of "huge difference". Not to mention the quality of the set itself.
The term "4k" is ridiculous, it makes people think it's 4x the scanlines of 1080p when in reality they started counting *columns* instead of *rows.* But that said there is an obvious, noticable increase in quality when watching 2160p over 1080. This guy just doesn't care about quality. It's no surprise his video is poorly lit and looks bad.
I have a 75" 4k tv and at the distance of 10 or so feet I sit at I can still see a difference but a bit closer up and it's freaking night and day. It also must also depend on ones own eyes.
Im the same way as Hells Cat. Family just got a 70" Samsung 4K/HDR. For the size room we have, and that size of a TV, it makes 1080p look like shit! However the bigger difference maker on that TV for what I use if for which is my Xbox One X, is HDR. All that said, I would still rather keep my Xbox One X hooked up to my 22" 1080p Monitor, and pray that the game devs allowed me to use that extra processing power to run a game at 1080p 60fps. I can also see a big difference in games running on my Xbox One X compared to my One S, both connected to my 1080p Monitor due to the way that games were made with dynamic resolutions, and are now all downscaled from 4K instead of upscaled from 900p.
Turns out humans are bad at predicting things, who knew?
Depends on the content and how fast it is moving. There are times when a 4K image will be perceived as a tad sharper and crisper even though you can't distinguish the pixels.
viewing distance will affect your perception of sharpness..
With regard to frame rates: you mentioned it yourself, a person's visual perception improves when they are more keyed up. An example of such a demographic is competitive twitch gamers, who are not only keyed up but self-trained to notice details reflexively to improve their competitiveness. You are correct about the motion blur, but remember that human vision is different from screens in that it is _continuous_. There are no frame rates at all, because human vision does not chop up its input signal into individual frames, so it has the potential to pick up on any moment of time, regardless of its brevity, so long as the person is either 1: lucky, or 2: watching for it. So, while high framerates may be way over what is needed for passive entertainment (things you watch), active entertainment (things you interact with) would still benefit from those increases.
As for that one pixel, it's easy to miss. However, if you were to make a 1-pixel line that extended over a quarter of the screen, it would likely be noticed, and some might wipe at that spot thinking a hair was stuck to it. Individual pixels may not be noticeable, but remember that the smaller the pixels, the sharper the edges, and edge detection is something the human eye is very good at, a point you made yourself in the self-driving cars episode, so sharper edges would be noticeable. And while the fovea is always the center of vision, that center can be moved; even if only a small portion of your visual area has high resolution, the fact that the eyeball is both redundant (there are two of them) and mobile (it can twist about), combined with a mental frame buffer, means that the brain's "virtual screen" is much wider. Of course, this is pointless where passive entertainment is concerned, but active entertainment would still see benefits from higher resolutions and framerates, particularly for the previously mentioned keyed-up twitch gamer demographic.
This is one of the best answers in the comment thread.
@@antanaskiselis7919 I have to agree. @Lampros Liontos well put! He has some merit in the video, in areas, but it's a broad subject with a number of applications. What makes me sad is the number of people who nuh-uh and personally attack people for their statements, considering the majority probably have no knowledge of the subject and are great examples of the Dunning-Kruger effect. We need to take a note from Lampros and educate and discuss rather than assault. It solves nothing and betters no one.
Another point is being an artist. 4K is amazing for artists, as we can better produce larger-scale and more detailed works. And higher definitions allow for larger monitors (frankly, buying a 4K TV as a computer monitor has no real downsides, and is more functional), since they allow for more references on screen, more tools, and other useful things an artist would otherwise have to sacrifice image space for, or constantly have to dig out any time they needed them.
Also speaking as a person who did some pro gaming as a teenager (Try not to judge me too much)
We all turn off motion blur because of the chance it could obscure an important detail. In games like Rainbow Six, seeing the muzzle of a gun around a corner from 15 meters away can be the difference between winning and losing that round. If you have any blurring effects such as depth of field or motion blur, it will decrease the odds you notice.
Phenian Oliver smart kid, agree with you about the motion blur that obscures details and can affect gameplay because it makes it harder to spot something noticeable. Great comments, and don't worry, only morons stuck in the 'olden days' judge gamers, even though it's more widely accepted now by adults.
I don't know, man. As soon as I upgraded to 4K I noticed an immediate difference. So noticeable that I sold 3 of my other 1080p displays and got another 4K monitor.
Pretty much everything in this vid is wrong; watch some other videos if you wanna know more about 4K.
king Dione why would you get a 4k monitor. Don't you know that operating systems and video games' retinas can't see in a resolution that high? And if you play the game in a high resolution, the game itself would get dizzy!
@@AbdallahTeach what the hell are you saying
@@shanithan9573 Woosh
@@volen6031 welp. Im a dumbass
We don’t see in 4K… that is apt; but that’s still not enough for the intended effect. We’re not trying to match what our eyes can see, we are attempting to overmatch what our eyes can differentiate so it gives the impression that we’re looking at real life when we’re looking at the screen.
"99.9999% of germs, who cares right?"
Covid-19: *stares*
you're a moron
@@recoveryemail1046 lmao sure
Recovery Email it’s called a joke..
@@recoveryemail1046 you're a moron
@Morris Stokes Sanitizers and wipes usually have alcohol or a similar chemical that kills chemically, whether it's a virus or bacteria, the way bleach would, so they also work for covid. That's different from stuff like antibiotics, which target things only bacterial cells have.
The laptop I have is 768p. I use a 1080p monitor for my business PC, and notice a difference. And for my gaming PC, I have a 4K monitor. I also notice a difference.
Hi
It's annoying to see fake Justin y.
+Factual Opinions how do you know he's fake?
@@vaz270 the real Justin y has like 100k subs
Oh come on. Did you even watch the video? Listen to what was said? "The average consumer will not be able to see a difference on a TV at a 10-foot viewing distance." Of course you can see a difference on your monitor from nose-length distance...
I like the volume of your voice. It’s very soothing.🙂
Speaking as a professional videographer, this video gets a lot wrong:
- Motion blur on cameras isn't caused by the frame rate, it's caused by the shutter speed. The general rule is to have your shutter speed at double the frame rate. You can achieve this same "no motion blur" effect by shooting at a 1/250 shutter on your camera.
- There's plenty of 4K in 2018; it's on YouTube and Netflix. I know Netflix Original shows are required to be shot in 4K now, even though most people will be watching them in HD. Optical media is dying; it's now physical media vs. streaming. Hell, most computers don't even come with disc drives anymore.
- 927GB for a 90-minute 4K movie is out of whack. You're correct if we shoot in an uncompressed RAW format, but 99.99% of people don't do that because of the huge space requirements. We have this thing called video compression that squashes video down to a more tolerable file size. My Sony a6500 4K camera can shoot 1 hour at the 60 Mbps setting on a 32GB memory card. Plus, newer codecs like H.265 are gonna squash the file size down even more and still provide clear quality.
✌️
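To put a number on that compression point, here's a rough sketch of the bitrate-to-file-size arithmetic (decimal units, constant average bitrate assumed; the function name is mine):

```python
def video_size_gb(bitrate_mbps, minutes):
    """Approximate file size in GB for a given average bitrate in megabits/s."""
    return bitrate_mbps / 8 * minutes * 60 / 1000   # Mb/s -> MB/s -> MB -> GB

print(video_size_gb(60, 60))   # ~27 GB: an hour at 60 Mbps fits on a 32 GB card
print(video_size_gb(60, 90))   # ~40.5 GB: a 90-minute film at the same setting
```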
williamdlc praise be. I'm an editor, and I thought the same shit you did while watching this.
Good points for video production. But not so useful for the average person watching TV at home from 10 feet away.
And H.265 does require quite a bit of processing power that the average person does not likely have on tap. Moreover, streaming 4k requires a wide pipeline over the old cable modem to boot, not the 1 to 3 MB/s that most people can afford.
Optical media isn't dead, it has WAY better A/V quality than streaming. Lossless 7.1 audio, HDR, and high bitrates. Hell, a UHD Blu-Ray can be 100GB! Can you stream that? No. Internet speeds and data caps are why Blu-Ray will live on.
Kyle, for the record, my very affordable (about £1 per day) connection has a solid 200 Megabits/second speed, and it's not the fastest around here. (Others have 'up to' 400 Mb/s.) I have no data cap whatsoever, apart from a fair-use policy which will temporarily halve download speeds for the top 5% of users, during peak periods, after an hour. That 200 Mb/s means that I can download 25 Megabytes per second, or 1 GB every 40 seconds from a good source. (A Debian Linux DVD takes about 3 minutes to download from the Netherlands.) So 100GB could be downloaded in 67 minutes or so; less than the time it takes to view most movies. And this is England. Some countries have higher affordable internet connection speeds. Surely the US has better, and lower-cost, internet connections than the majority of countries?
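The arithmetic behind those figures, as a quick sketch (decimal units; protocol overhead ignored, function name is mine):

```python
def download_minutes(size_gb, speed_mbps):
    """Minutes to download size_gb at a sustained speed in megabits per second."""
    return size_gb * 1000 / (speed_mbps / 8) / 60

print(round(download_minutes(100, 200), 1))   # ~66.7 min for a 100 GB disc at 200 Mb/s
print(round(download_minutes(4.7, 200), 1))   # ~3.1 min for a DVD-sized image
```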
You are aware this video was made in 2017, and in one year a lot has happened.
2019 all I can buy is 4k tvs on Amazon.
Yeah this video didnt age as well as he would have hoped.
@SysPowerTools I guess he should have Known Better.
14:55 He literally says 4k will happen, but not because it makes sense.
4K is just 1080p doubled in each dimension, so you can still watch 1080p content and the pixel count maps exactly (each 1080p pixel becomes a 2x2 block) without any sharpness loss 😉
He didn't say you couldn't buy one dipshit. Only that you would be wasting your money.
MKBHD: Are you challenging me mortal?
*laughs in 8k raw footage being edited on his 6k dual monitor setup*
And 90% of his audience watches it on a shitty 6 inch smartphone
@@Axyo0 exactly!
@@Axyo0 yeah, but hey, a lot of phones are now in 2k and have exceptional quality, as well as HDR support and high DPI
@@Axyo0 I feel attacked 😂😂😂
@@Axyo0 I watch his videos in 1440p usually. They look much better than almost everyone else's youtube videos
5 years later, I'm back for the 8k rant.
I don't think your examples are a good illustration of your main points here.
1 - Being able to distinguish an image flashed for 1 frame does not correlate with being able to perceive smoother frame rate. I'd be more convinced if there were a study that showed that people were not able to distinguish between movement on a 60hz display vs movement on a 120hz (or 240hz) display. I might not see an image that appeared for 1/144th of a second, but anybody with functioning eyes will be able to tell 144hz from 60hz.
2 - Similar to the above, being unable to detect a tiny dot on a piece of paper doesn't correlate to not perceiving the increased fidelity of 4k. The real question is can people tell the difference between 1080p and 4k, and the answer is a fairly resounding yes, though perhaps not to the degree of 60fps vs 144fps.
Not 100% sure if it applies, but according to the Nyquist-Shannon sampling theorem, for an optimal experience the frame rate should be at least 2x what you can perceive.
Regarding framerate, LinusTechTips made a collab with Nvidia about different refresh rates (they still hadn't uploaded it at the time of writing), so I guess it might interest you.
And while I think your points are generally sound, your logic with photoreceptor cells is flawed.
While, yes, we have far fewer cone cells than the number of pixels on a 4K screen, it is precisely because they are condensed that higher resolutions matter. We don't see the entire screen at once, but anywhere the vision focuses may well be seeing up to the number of pixels in that area, depending on screen size and viewing distance.
Sure, we may not be able to discern single pixels, but we also can't discern single receptors; that doesn't mean they aren't carrying useful information.
This is exactly why pixels per inch and viewing distance matter much, much more than just "higher numbers = better viewing experience." Glad you took the time to write out that reply, man! :D
I think most retinal cells are ~"multiplexed"~ by a factor of about 2-to-1, cell-to-neuron, as far as connectivity goes. Our expectations also directly impact scene perception. We're colorblind in the periphery, but track movement better there. There are also cranial nerves that "intercept" the optic nerve and activate neck muscles based on info from the periphery, which is also directly connected to the brain area that controls micro-saccades (eye-twitch rhythm). Bottom line is you can't stop at the retina when discussing perception. Vision ≠ perception.
Without even realizing it, you just proved his entire point. "....we may not be able to discern single pixels," because the eye cannot cognitively perceive the difference between a line that's .01134 and one that's only .02269 from eight feet away on a 50 inch screen, but "that doesn't mean they aren't carrying useful information." Yes, you're RIGHT; they ARE carrying useful information. The kind of useful information that's employed for mental reprogramming and conditioning the viewers.
As with most engineering types, KB is making a failure of looking at raw numbers and ignoring dynamic effects. The flaw in the original CD standard was that the human ear, most of the time, can only hear 20kHz max... so 40kHz can capture all you can hear unless your range is higher than most.
The problem is, while we can only directly hear 20kHz, we can hear harmonics and transitions at much higher rates, which led true audiophiles to sense that CDs sounded "muddy"... because they were muted in those high kHz harmonics and transitions they were fully (or at least unconsciously) cognizant of. Hence the MP4 standard, which allows variable frequency rates as high as 320kHz or higher (320 is the highest I'm aware of seeing spec'd for anything. No idea if you would ever need anything that high, or even higher). But most MP4s are at 120kHz and higher, 3x the CD spec.
He's analyzing vision from that same purely mechanistic view, failing to grasp that our brains interact with information in a wide array of manners which extend into spaces he argues (incorrectly) that we are incapable of recognizing.
Note, not saying he's entirely wrong on all counts -- just that he's not making a valid analysis because of ignoring those dynamic effects and interactions with the brain, which can interpolate stuff into areas we are unconscious of. A cognitive biologist would probably need to analyze his observations for validity, likely even doing some experiments where none have been done before.
@@nickbrutanna9973 Except that in the only actual tests done on the subject, like with actual double blind testing, audiophiles haven't been able to tell the difference. Audiophiles are just as likely to be swayed by their bias as regular people. The "harmonics and transients" stuff is 100 percent outside of the human hearing range. You actually need to listen to the engineering types, because they're also scientifically literate. This particular video does mix some things incorrectly, but not as much as what you have done.
I know other people have said this video is crap (it is) but I'd like to clearly and scientifically debunk it.
You claim that the eye "sees" in 45-72fps, and therefore that you wouldn't see anything if it flashed for 1/72 of a second (~14ms). In fact, this rate refers to the temporal resolution of the eye: if a light flickered on for 1us (0.001ms) you would see it, but if it flickered twice 10us apart you would only see it as one flash - events that happen within one 'frame' are averaged together. That said, this 'framerate' is (at rest) not 45-72fps, it's 80-100, and thanks to aliasing effects analogous to what we see with mismatched resolutions it is possible to detect artifacts at 'framerates' of up to 1000Hz
You claim that, because 4K UHD is 8.3MP and the eye "sees" in 2MP (very sketchy way of getting there might I add) the resolution is pointless - but the visual information from the fovea (your centre of focus) has much higher 'pixel density' than elsewhere so if you look at any one part of the display that resolution becomes useful. The average eye has an angular resolution of around 1 arcminute (1/60 of a degree) - that is, it can differentiate two objects 1 arcminute apart. This means you can differentiate two pixels more than 0.73mm apart 8ft away - which at 1080p means displays larger than 60''.
This doesn't factor in the significant proportion of people with better than 20/20 vision, who would need better than 1080p to have a good experience even at 55in. (8K on TVs is pretty useless, as even most USAF pilots would need a >60'' display to see pixels at 4K - although I could see it coming to cinemas in the next few years.) Even with 20/20, you would want a higher resolution than this minimum to ensure that pixels are not just close enough that you can't see them, but also close enough that they smoothly blend like a real-life, analog scene would, avoiding aliasing.
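A small Python sketch of that 1-arcminute calculation (function name is mine; the slight difference from the 0.73 mm above is just rounding):

```python
import math

def resolvable_pitch_mm(distance_mm, arcmin=1.0):
    """Smallest pixel pitch resolvable at this distance for a given angular resolution."""
    return distance_mm * math.tan(math.radians(arcmin / 60))

eight_feet_mm = 8 * 12 * 25.4                    # ~2438 mm
pitch = resolvable_pitch_mm(eight_feet_mm)       # ~0.71 mm per pixel
# At 1080p, pixels only become resolvable once the screen is wider than 1920 * pitch;
# convert that width to a 16:9 diagonal.
width_in = 1920 * pitch / 25.4
diagonal_in = width_in * math.hypot(16, 9) / 16
print(round(pitch, 2), round(diagonal_in))       # ~0.71 mm, ~62": pixels visible only above ~60"
```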
There are a bunch of other things which don't detract from the main point of the video but are still worth mentioning:
- VR, AR and 3D are _still_ the next big thing, there's no point to the "Pepperidge Farm Remembers"
- people who brag about 150fps are people with good enough hardware to achieve 150fps - meaning they usually have money to spend on a higher refresh rate monitor. Not to mention that having higher framerates on lower refresh rate monitors still serves to reduce input lag.
- the 'Soap Opera Effect' is actually when TV hardware generates interpolated frames: 24fps content can be shown on a 120Hz screen either by repeating each frame 5 times, or by 'interpolating': trying to work out what should happen in between. Since we can tell the difference between 24fps and 120fps content, and we associate the 'feeling' of 24fps with cinematic content, this interpolation makes it feel more like TV, and less like cinematic content. It doesn't cause motion sickness (unless the interpolation algorithm is poor) it just makes the content feel less professional. It can usually be disabled so that frames are held for as long as necessary, so there is no downside to having a higher maximum refresh rate.
- there's nothing wrong with the naming conventions. The 'nice pattern' of 480, 720, 1080 continues to 2160, just as the pattern of SD (Standard Definition), HD (High Definition), FHD (Full HD) continues to UHD (Ultra HD).
2K and 4K were defined in the original DCI spec of 2005 as resolutions of 2048×1080 (yes, 1080p ≈ 2K) and 4096×2160 respectively, and since the new UHD resolution of 3840×2160 is close to this it's often referred to as 4K UHD. People (including you, I might add) misuse the term 4K but it doesn't make a huge difference or mean the conventions themselves are at fault.
- "anything from before last year will never be in true 4k because it wasn't filmed in 4k" Sony were selling a 4K cinema projector back in 2004, so plenty of digital media since then is in 4K - not to mention the vast quantity of analog film reels which can be telecined to 4K or even 8K
- "The current generation of Blu-Ray disc ... holds 125GB of data" there aren't any that hold 125GB. There are 25, 50, 100, 128, 200, 300GB versions, so perhaps you meant 128GB. Except... there's a 300GB variant? In fact 4K Blu-Rays were released long before this video, using 50, 66 and 100GB capacities, with films like Spider-Man 2, Ender's Game, The LEGO Movie, Kingsman all released with the format.
- "a 90 minute 4K video is 477GB" depending on the encoding. Uncompressed, it could be 8TB. It could, in theory, be 1MB. It could, more usefully, be 30GB. As with all videos, you get better quality with higher bitrate. That could be a reduction in artifacts such as color banding, or it could be an increase in resolution as here.
- you suggest that streaming in 4K would result in buffering. As above, the bitrate is completely arbitrary, so you could stream average-quality 2160p video with lower bandwidth than high-quality 1080p. It's a resolution/framerate/bandwidth tradeoff, so it's just about how much you pay for internet.
- "and there are virtually no games in 4k right now" the vast majority of games have resolutions that scale with your monitor. The 980, released over a year before this video, can deliver solid framerates (on titles available at the time) at 4K, so the processing power (while large) is not impossible to achieve.
- "Steam players are the PC gaming elite". Where in hell did you get that? Steam is the world's largest game platform. Steam players are pretty representative of *all* PC gamers.
- "in 2016 only 25% of new TV sales were 4K"... representing a *huge* year-on-year increase which continues to this day.
- "it's going to take another 5-10 years for 4K to catch on to the mainstream" I would already consider it mainstream, with most new TVs being 4K and most digital content providers providing 4K content.
- "8k will never happen" It will appear in cinemas and on large high-end TVs
- great poll there. sample size of 15. super duper statistically significant.
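As promised a few items up, here's a rough sketch of the raw-size arithmetic behind the "uncompressed, it could be 8TB" point (my own function; decimal units, full RGB frames, no chroma subsampling, so it's only a ballpark):

```python
def uncompressed_size_tb(width, height, fps, minutes, bits_per_pixel=24):
    """Raw (no compression) video size in TB for full RGB frames."""
    bits = width * height * bits_per_pixel * fps * minutes * 60
    return bits / 8 / 1e12

print(round(uncompressed_size_tb(3840, 2160, 24, 90), 1))       # ~3.2 TB, 8-bit RGB, 24 fps
print(round(uncompressed_size_tb(3840, 2160, 60, 90, 36), 1))   # ~12.1 TB, 12-bit RGB, 60 fps
```

So the multi-terabyte figures are plausible under some assumptions; the point stands that the delivered size is set entirely by the chosen bitrate, not the resolution.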
Did you have to use google to find this?
Yo, 8k is already a thing Google it. It'll be mainstream sooner than u think
"8K will never happen" is just really stupid and arrogant. The human eye can see much more than 4K, especially in closer range.
Huy Đức Lê Uh, at closer range you're eliminating the spatial area of the pixels and screen and making them "larger and the screen smaller"... (aka shrinking your screen and magnifying the pixels to the focal point of your eyes). You STILL won't tell a difference. It's a dumb topic and it's all based off of $$$$. When you see an atom with your bare eye, please let me know lol... take a math class beyond high school before calling someone arrogant. Jeez.
+G. Smith by closer, i mean TV to armchair. Not 0.5 meter. That's totally enough to cover the entire screen
It's so interesting coming back to this video
You aren't considering the fact that the extra pixels give an increase in perceived contrast. This actually makes things look more realistic even if you don't actually resolve all the pixels. It's noticeable for a lot of people.
Yep especially noticeable on highly textured surfaces and animal fur.
This. 4k gives a level of realism to textures and edging which has to be simulated otherwise by techniques like antialiasing which give inferior results and often result in visible artifacts.
4K is much more crisp and the colors more vibrant to my technologically layman eyes.
Roy Riley so true. And because you can move your head by such small distances, your eye will land between 4K pixels, so higher res will always improve your experience.
I guess I must be a fighter pilot high on every narcotic in existence, because I can see the difference between 1080p and 4K, and the difference between 60 and 120 or 144 frames, and it's quite clearly obvious.
It’s not true 4K, so you’re not seeing a difference of 1080hd and 4K, cuz it’s NOT 4K!
Everyone can. This is a really poorly researched video.
Slam......Do you always respond to other people that way?? Getting off the main subject here, but maybe you are looking at yourself in the mirror as you say all those childish things!
It's true. I run SVP whenever I watch video on my PC to increase the frame rate. If I don't have more than 60 fps, and I prefer 120, I don't want to watch. I probably won't buy anything less than 1080p either. I'm not a fighter pilot.
are you high ? : )
The title is correct at least.
We don't see in 4k, that's true
No. We don't. It has its merits in giving off an optical illusion that makes us FEEL like we're REALLY seeing an increase in quality, but our eyes and brain have their limits. It's called perceptive fantasia. I've even refuted this nonsense with a link to a peer-reviewed study that explains how such things work between our eyes and a screen when it comes to resolution and FPS/Hz. I think anyone with any sense can concur that the ONLY thing that is going to TRULY give you the most BADASS of all BADASS quality... is actually paying the price for the part upgrades inside of your RIG, not the monitor. The monitor helps, but it's really your computer/console superiority that is doing all of the real work. If I asked you to sacrifice either your 4K screen for a 1080, or your 1080 Ti or even 2080 Ti for a 720, then which one are you going to ultimately favor when it comes to performance and quality?.......
Like, I would like to think that anyone with any SENSE would say fuck 4K lmao!.. You can actually play HIGH-ass quality games WITHOUT 4K. People are giving most of the credit to the wrong shit, when they should really be giving the credit to the people who are responsible for making their ever-evolving parts. Even the TYPE of screen is more important than the size or pixel count of the screen. Like, having a cheap, low-quality screen type with 4K wouldn't really do anyone much justice, would it?... I've got to say, for a generation that likes tech so much, they sure as fuck don't argue over the right points... This 4K altercation got REALLY old a LONG-ass time ago. These people are driving me INSANE with this vacuous absurdity.
@@dentongaemz8359 Well, there is good reason to not play fps's in 4k. It's hard to hit 60fps in 4k, unless like me, you actually have a 2080ti and a 4k monitor. 1080ti gets like 30-40fps. 4k is great for story games with great visuals. Not so great for competitive gaming, you could play in 1440p at 144fps.
We don't, but we think we do. So it doesn't even matter, because our brain says it is better. And we are our brains. So in the end, we basically see 4K even if we don't really see it.
I really like having a really big 4k monitor for strategy games. I always sit really close to it though so it makes a bit of a difference
"Anyway, the cones in your eye are ONLY red, green, OR blue." Humans have three different cones that respond to short, medium, and long wavelengths of light. This produces color roughly corresponding with blue, green, and red respectively. Humans are trichromats, there are other living creatures that have more (or less!) responsive cones, these are called tetrachromats or pentachromats.
"...there are only 2 millions 'pixels' in your eye." Humans can see far more color than that, about 5 times as much actually. The RGB color value for traditional displays is limited, it's called the color gamut. Most displays shoot for ~90-100% sRGB which does not cover the whole array your eyes can see. Further, RGB is not quite accurate to how we see, HSL is much closer. That's Hue, Saturation, and Lightness. Essentially it's color, intensity, and brightness (where rods actually matter). 4K TVs are slowly pushing the envelope here by increasing the bit depth of displays and achieving a broader color gamut.
Now... your whole argument about 4K being essentially worthless because it's too many pixels is simply bunk. Pixel density is based on screen size and pixel count, and its importance varies depending on the viewing distance and the visual acuity of the person in front of it. It's hard to quantify an ideal density due to the differences in viewing distance, but there is a reason why an iPad with a Retina display looks so sharp compared to your average 1080p TV at a proportionally equal distance for their relative display sizes.
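On the bit-depth point, the step from 8-bit to 10-bit panels is easy to quantify with a quick sketch (note this counts distinct code values per pixel, not the gamut coverage described above; function name is mine):

```python
def displayable_values(bits_per_channel):
    """Distinct RGB code values at a given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

print(f"{displayable_values(8):,}")    # 16,777,216 (standard 8-bit panels)
print(f"{displayable_values(10):,}")   # 1,073,741,824 (10-bit HDR panels)
```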
Mantis Shrimp FTW!
Yes, a pixel is the individual R/G/B elements of the pixel arrays.
Knowing Better said: "Now get this: there are 8.3 million pixels, and there are only 6 million cones in your eye, but that isn't the whole story because each pixel is 3-pixels in one: red, green, and blue. ... Anyway, the cones in your eye are only red, green, or blue and while the ratio isn't exactly a third if we match them up to make a pixel, that means there are only 2 million 'pixels' in your eye. Before you say anything, rods don't matter if cones are the pixel in your eye then rods are like the back-light."
This implies to me that either: 1) We are only capable of seeing 2 million different colors, or 2) We are limited to only seeing up to 6 million pixels
It's not very clear what the point he was trying to make was as both of those points are flawed.
Me turning off motion blur: Ah, yes. Now I can see my enemies clearly and cleanly at a distance of 8 kilometers.
Shows a poll with 15 votes LOL
With a couple hundred thousand subscribers and millions of views, he's probably laughing at you idiots who think you're the smart ones...
@@walterg74 Why's that? You think you can go full time just on that? LOL
@@walterg74 You know, being famous doesn't always mean that you are right.
@@walterg74 Walt, if your perspective is that people must be correct based on subs and views, you most likely won't live into old age, because you'll trust that life-hack video that tells you to plastic-wrap your food before you cook it... Do yourself a favor and think for yourself. This guy might have a credible channel, but this video is absolute bull.
Toxic ^^
The editing on this was great. That being said, the video didn’t age that well tbh. I think this guy underestimated the progress of technology and how fast it really moves.
@@marvinmallette6795 His main point is that you can't see 4K, which isn't true (just like most of his videos).
We bought a 4K TV about six months after this video was made and by the end of 2018, there is plenty of 4K content and I can see the difference.
You're lying to yourself, because you want to believe it too much.
@@Openreality no, there are plenty of videos debunking this guy, he does not know what he is talking about. You can see 4k.
You can definitely see a better blend of colors due to more pixels. Also look at a face on a 1080p television and look at that same face on a "4K" television with 4K content. You can see much more detail, filling in the face, on a 4K television (again with 4K content). The difference is blatantly clear.
5:45
@@Openreality You're blind if you can't see the difference between 1080p and 4k. Can you not see the difference between 30hz, 60hz or 120hz also?
Yes I do. Now stop making false claims. The resolution the human eye can focus is multiple times higher than standard 4K, and that's completely ignoring your peripheral vision.
Also of note: so-called "perfect vision" is not perfect. It's just a bare-minimum standard considered acceptable before correction is needed. The majority of people in the world see significantly better than that standard, especially young people.
Yes, in fact investigations and research are revealing that the pixel resolution of the eye may be as high as 10000 x 10000, with frame rates approaching 1000/sec!!! I mean, just do side-by-sides of 1080p and 2160p 4K UHD... it's mind-blowing... which blows this guy's pseudo-science out of the water.
Randomly found this trash video [haven't watched it yet] and happened to see a lot of butthurt comments and jokes. Well, I'm here to say one of my eyes used to have inhuman vision, but only when I closed the other. When I did, I could see the eye chart [you know, the one with the big R etc.]. The doctor person said I had 20/22 vision.
yeah i see in 4k cause i just downloaded more ram for my eyes
@Thebrightestbrick88 you wanna a cookie?
@Thebrightestbrick88 Uhh.... Yes you can. Ever heard of downloadmoreram.com?
The brightest brick guy deleted his comment lol i can’t see it
Yea so funny man, are u 8 yrs old?
lol
This video aged like milk
Games dont "come in 4k". Also Bluray discs don't contain uncompressed 4K video like you seem to suggest. 4K Bluray discs have been around for a while. Its like this video was recorded in 2012 and you forgot to publish it.
for reals
Battlefield 1
Destiny 2
Divinity: Original Sin 2
Doom
FIFA 18
Fortnite
Forza Motorsport 7
Gears of War 4
Grand Theft Auto V
H1Z1
Halo Wars 2
Hellblade: Senua’s Sacrifice
Law Breakers
Middle-earth: Shadow of War
Nex Machina
No Man's Sky
Overwatch
PlayerUnknown’s Battlegrounds
Prey
Pro Evolution Soccer 2018
Project CARS 2
Resident Evil 7: Biohazard
Sniper Elite 4
Tekken 7
The Witcher 3: Wild Hunt
Wolfenstein II: The New Colossus
apparently all of these games do
Lol I think he’s thinking of consoles and how they have no options for 4K (except a few on the half gen consoles)
Stev Zarr m8.... m8.... we are talking about video games, not movies
@@AndrewBouchierUK Well, no video game really "comes" in any resolution. One that is made in a smart way will scale to arbitrarily high resolutions if you have a powerful enough computer for it. The games you listed simply market the fact that they can (like pretty much every game made in the last 10 years) be rendered in 4K, should the user wish to do so.
The difference between a 4K Blu-ray and the 1080p Blu-ray that also comes in the box, on a 65-inch screen, is 100% noticeable. Watching a film that is actually shot and mastered in 4K is miles better.
Connor Ross Right, and it's also important to mention that 4K Blu-rays are bit-starved (they usually use twice the bitrate for a resolution with 4 times as many pixels, while HEVC is just 20-50% more efficient than H.264, not 100% like they promote), so there's still room for improvement without even raising the resolution.
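A back-of-the-envelope sketch of that bit-starvation point (the 30/60 Mbps figures are illustrative ballparks I've assumed, not actual disc specs; function name is mine):

```python
def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average encoded bits spent on each pixel of each frame."""
    return bitrate_mbps * 1e6 / (width * height * fps)

hd  = bits_per_pixel(30, 1920, 1080, 24)    # ballpark 1080p Blu-ray
uhd = bits_per_pixel(60, 3840, 2160, 24)    # double the bitrate, 4x the pixels
print(round(hd, 2), round(uhd, 2))          # ~0.6 vs ~0.3 bits per pixel
print(round(uhd * 1.5 / hd, 2))             # ~0.75: even +50% HEVC efficiency doesn't close the gap
```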
+1 I notice it immediately too. The thing is, the center of the fovea is very photoreceptor-dense, and many people can tell the difference between a high-quality 4K moving image and that image downsampled to 1080p. It's just that it's the individual spot you're focused on in a given moment that you'll pick up, not the entire screen at once, which is one of the major flaws (out of many) of this YouTube video. Most 4K Blu-rays also give us an increased color gamut, and that's also immediately noticeable to the majority.
35 mm film has a higher resolution than 4K.
Not because of the extra pixels on the screen. Just because of the extra bits.
@@dizzywow 100% both
This is gonna be like the “your eyes only see in 60 FPS” meme
Do you mean the 30 fps meme?
that meme is only 30 fps, scrubs can’t imagine over 35 fps let alone 60+
144hz
@thicc Spider most older games are capped. Almost ALL modern PC games are unlocked.
@lɐɯɹou ɐ Spider Most console games are capped.
This aged well
Higher framerates are better not because you can spot every frame but because it just feels smoother and input lag is lower. Having said that, I still stick with 60fps.
It's almost like a higher refresh rate would be better at simulating reality, where for most intents and purposes time is continuous.
It feels smoother because the interval tightens, and resyncing with the screen is faster because the interval is smaller. However, you could put effort into maintaining a steady frame interval instead of pumping effort into higher frame rates. A pro gamer could perform better at 45fps with rock-solid frame intervals than at today's 120fps, because he would be able to trust his timing and perception even more, and thus have more success intercepting movement when shooting moving objects.
That is also why a movie played from a solid medium does not feel stuttery, but a game on PC running at 24fps feels really bad. The brain will perceive the motion as much smoother if the playback is consistent.
Martin Panda
You know that locked 120fps are possible, right?
@Martin Panda: "A pro gamer, could perform better on 45fps, if it ran with solid frame intervals, than on today's 120 fps, because he would be able to trust his timing and perception even more, and this have more success intercepting movement, for shooting moving objects."
That's actually demonstrably false. There are plenty of games where you could pick a resolution 'low' enough for the game to run smoothly at 120fps basically all the time. Not only that, but in online gaming a lot also comes down to having a game programmed well, with proper network code and fast enough servers. Fact is, smooth gameplay at a rock-solid 45fps versus a rock-solid 120fps will *always* result in smoother play at 120fps. There's no debate there. If you knew pro gamers, you'd also know they prefer 120fps over anything lower.
Last but not least, by your 'logic', a rock solid 24fps would perform just as well as rock solid 45fps, which is simply not true.
You think it feels smoother. As long as your video card can render 60fps, and the rest of the computer can feed it 60fps, that's (mostly) what you see.
If you watch a DVD on your computer, is it blocky and laggy? That is the good way to demonstrate what your screen can give you.
We get upset when we're gaming, and in huge battles we see the blockiness. That isn't because the screen can't do it. It's something about the game. Any computer based fps display isn't necessarily showing you what has gone through the video card to your screen. It's what should have been rendered.
So sure, a complete system optimized for 120fps will look "better" than 60fps, only because it's satisfying all the demand for 60fps, from the game through the light emitted by your display. When 120fps degrades to 90fps, and all the way down to 60fps, it's still satisfying your 60fps need. When it degrades more, you'll see it.
A good system and video card satisfy that, without trying to bankrupt yourself trying to reach the 120fps mark.
We have a lot of technology now that depends on your visual acuity only being 60fps. Sometimes it's only 50fps. Like, some people can see lights flicker and get headaches when the local power grid is only 50hz. Not everyone, only a few. You still don't notice things like multiplexed segmented displays flickering at you at 60fps, but it (sometimes) shows up on video recordings.
A lot of affordable 120hz screens are really 60fps with a 120hz pulsing backlight. It lets them advertise "120hz", without admitting they're only 60hz. Most people won't ever figure it out, unless someone shows them the detailed spec sheet.
The whole point of more pixels and a higher frame rate is not to see the individual frames/pixels but to see them all in conjunction for a smoother experience.
A smoother experience also depends on your video processing and the communication between hardware, and yes, a higher framerate or Hz does mean smoother. That is why 4K monitors struggle to reach a good response time and a high refresh rate.
Exactly. You want a screen that is beyond the performance of the human eye. That way, what you see is indistinguishable from what you would see naturally. You won't detect any display artifacts from having visual acuity better than the screen you are looking at. There is more to what you see that simple pixel counts on a screen.
You forgot to mention one of the real reasons to get 4K. People often upgrade the size of their TVs, and once the TV gets large enough, the pixels become 1) too large or 2) too spaced apart. This is obviously affected by the viewing distance to the TV, and like you said, the average consumer is probably not buying a 70+ inch TV, but it's still an important fact. This effect is also noticeable with new phones, depending on the display resolution. When upgrading to a new-generation phone and looking at the difference in the display, it can become obvious just how much you actually can see the pixels on lower-DPI displays.
20 years ago, 40" TVs were only a dream for the average consumer. Things change.
yeah, one day our houses will be too small to fit a bigger tv
Yup. I love my 43'' 4k. Never see a pixel again. Lol
Darn it I went too big, can still see them on my 48". ;)
Obviously this is more noticeable in games, due to the huge contrast of the HUD; in movies it's harder to tell because the colour transitions are softer.
This aged horribly
I think one point is missing, which is pixel density. Higher resolutions look more crisp the larger the display. 8K might not make sense on a 60-inch TV, but might be more effective on a 6-foot-wide home projector. Right now that's not for the average consumer, but I can see projectors becoming a popular format during the growth of 4K.
Yeah, sadly this video just scratched the surface. The 4k discussion is also about compression, screen size, pixel density, viewing distance, processing power etc. Things that can fill a whole video series
Pri yon Joni all I know is, whatever the pixel density Def Leppard uses on their screens on stage is out of this world...they are disgustingly clear
The density/pixels per inch/ppi was what I was thinking right away, especially smartphones. I remember laughing when I saw a 5" smartphone with 4k resolution. The ppi in that is just ridiculous, and seems like overkill, even when considering the closer viewing distance. But to be honest, I'm not completely sure. After I was taught in school "Humans can't see more than 24 fps", and "Humans can't hear the difference between mp3 and FLAC", I'm just super sceptical about all of these "Humans can't..." claims.
Yes... but no. Home projectors are not a thing because you need to be so far away from a screen that big to see it in its entirety. Which means the pixel density is not remotely that important, because you're sitting so far away. Which is the whole point of a theatre.
Most projectors in cinemas are 2K...
4K is definitely an improvement over 1080p, and HDR amplifies that by adding more vibrant color than ever before.
The vibrant colors are just the video; anyone can take a 1080p video and increase the saturation.
@@mrbanana4536 There's a thing called Rec.709 and Rec.2020. Basically it means there are more colors in nature than your regular TV can display.
@@GabrielAlves-lp1qr actually there are more theoretical colors than colors that you can see/find in nature. Color space has nothing to do with resolution. What are you talking about lol
@@alessandrodimilla8450 Yeah, color space has nothing to do with resolution but HDR was mentioned in the conversation, which implies Rec.2020.
@@alessandrodimilla8450 And who is talking about theoretical colors? Lol
Nonsense, I've made jumps from 30 to 60 and eventually to 144hz on monitors, while I have certainly noticed the Upgrade, EVERY SINGLE PERSON THAT STEPS FOOT IN MY ROOM IS BLOWN AWAY AT HOW SMOOTH AND CRISP THE MOVING IMAGES ARE. Enough of your pseudo-science, of course nobody's eye sees in frames and pixels, that doesn't mean we should just settle for a 1080p 60hz screen. Cut the tech-shaming
lulz cut the falling for the consumerism.
The real MacGyver
He's right though. It is a ridiculously huge difference. The problem with the argument against it is that the studies don't actually test the perceived differences in fluidity and smoothness; instead they test individual frame recognition, where they flash an image on a single frame and ask you to identify it. This isn't an accurate test, as you don't see frame by frame; you see the sum total of the frames and then perceive motion out of it, and the more frames, the smoother and more realistic it appears. The actual upper limit of useful frame rate is over 1000fps, but 150-200fps seems to be a nice sweet spot.
lol I can't find anything to run on my monitor at 120Hz, only managed minecraft, at a push!
When I was playing on my 144Hz Monitor for the first time, I was laughing hysterically because I was so amazed by it :D
When a friend of mine also upgraded, he said he couldn't see the difference in games, but in those online 30fps vs 60fps vs 144fps comparison slideshows he could.
He's the only person I know who "can't see" 144Hz. Weird.
The real MacGyver just go look at a 4k 144hz screen yourself, "McGyver"...
Unbelievable how naive some people are :'D
It's kinda fun coming back to this video years later. Also a lot of what makes your vision good is your brain. Most of what happens in your eye doesn't matter. It's just about getting enough information for your brain to upscale it properly lol. Though if your eyes get worse your brain eventually can't compensate. So there's that.
"8k will never happen"
Samsung: hold my soju!
It's a marketing gimmick, so no, it truly will never happen, just like 4K. Even if it did, we as human beings wouldn't be able to detect it, again just like 4K.
@@Openreality You really believe this moron in the video? There are so many dislikes for a reason; you don't need to believe everything you see on the internet, you stupid moron.
If you see a 1080p and a 4K display side by side you will be so surprised, and if you put a 4K and an 8K screen side by side you will still be able to see the difference.
@@Openreality Of course it will; we'll need it for VR. They are already making VR with 4K screens because otherwise you see pixels. They'll even need more, as 2x 8K screens would not be enough.
Digio Yep the screen door effect
@@callmeosa9084 Just because you and so many others are butthurt doesn't make it less true...
I see in 40k.
I only see Blood for the Blood God.
I only see for The Emperor
I see dead people...
^ Didn't get the joke
Redgrave192 Damn right
It's not about the resolution, it's about PPI. So at phone size, yes, you're probably right, you can't see 4K. However, on, say, an 84-inch display like the one I have, 1080p looks like trash if viewed from closer than 10-15 feet. Even from that far away it looks noticeably lacking in sharpness. THIS is the point of higher resolution.
I completely agree. When people say "you can't see 4K" or something, I say you can't see 144p... if it's small enough.
@HEAVY SYSTEMS, Inc. ok, so what about VR? Pretty sure DPI will still hold true, but what resolution should be the go to or industry standard for 4" dual screens 1" away from your face?
noticeable improvement? Yes
Looks like trash @ 1080p? Either you're lying, or you're a super privileged bitch who has set the bar waaay too high, or your TV is trash, or the media is trash. Because a good 1080p film should look just fine on a screen your size. Stop being a baby.
Your eye can still see while the eyelids are closed. Close your eye and put a flashlight up to it, you will see it.
Ow.
Actually most major films since 2012 have been filmed in 4k
BloodRider 14
Many even film in 8K now.
Jack Hutchison True, but usually they continue with 4K just because of how simple the Arri Alexa system is, at least compared to RED DCs offering. Still, movies are the only medium where 35 mm film continues to be used on a large scale, so that's something as well
Thank you for saving me the trouble of saying so. Even my "prosumer" NLE app works in 8K.
Steve Jordan
The only reason to shoot or edit in 8K is the same as it is for 4K: Less fidelity loss during capture and editing; however, unless you're producing content to be shown in a movie theater, you're likely going to be targeting it for either Blu-Ray or HD streaming. Even digital projectors in movie theaters are only DCI 4K (4096x2160).
25% market saturation in televisions is low, especially considering 90%+ of all video content is being consumed on a phone or tablet that's not even full HD.
This means that your 8K content - which you filmed on a $100,000 RED Weapon camera (taking up half a petabyte of data for a 2-hour production) and then edited on a $100,000 editing bay with over $30,000 in hard drives alone - is, once uploaded to YouTube, mostly being watched at 720p anyway.
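For scale, here is a rough order-of-magnitude check on that storage figure. The resolution, bit depth, and frame rate below are illustrative assumptions, and real cameras record compressed RAW, so treat it as a sketch rather than a real data budget.

```python
# Order-of-magnitude check, not a real data budget.
width, height = 8192, 4320        # one 8K frame (assumed full-width sensor)
bytes_per_pixel = 2               # 16-bit sensor data (assumed)
fps = 24
final_runtime_s = 2 * 60 * 60     # the 2 hours that end up on screen

per_second = width * height * bytes_per_pixel * fps      # ~1.7 GB/s uncompressed
final_cut = per_second * final_runtime_s                  # ~12 TB for the runtime alone
shooting_ratio = 0.5e15 / final_cut                       # how much you'd shoot vs. use

print(f"{per_second / 1e9:.1f} GB/s, ~{final_cut / 1e12:.0f} TB for the final runtime")
print(f"half a petabyte implies shooting roughly {shooting_ratio:.0f}x the final runtime")
```

Uncompressed, the footage that actually ends up on screen is on the order of 12 TB; a half-petabyte total would imply shooting roughly 40 times the final runtime, which is plausible once you count multiple takes and cameras.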
The problem with digital movies is not the format they were shot in, it's the resolution of the master used for post-production.
A lot of movies shot in 4K actually have a 2K master because, until very recently, it was much easier to manage in the editing and VFX process, and also much cheaper.
So there are a lot of "4K" movies that are in fact 2K masters upscaled for 4K Blu-ray, even if they were shot in 4K in the first place ;)
The best examples are Pacific Rim and Gravity, both finished on 2K masters that were upscaled (in theatres the upscaling was even worse because they were shown in IMAX^^). So if you are buying those 4K Blu-rays, you are only paying for better compression, not a more detailed image than the standard Blu-ray :)
Not all movies are shot widescreen. Some movies are shot in what's known as "open matte". This is when they shoot on 4:3 film and then crop it to the widescreen aspect ratio in post. So in some cases, if you watch the fullscreen version of a movie, that's the one with the uncropped image.
"The Shining" (1980) is an example of this.
This is also why you sometimes saw the boom microphone when watching a film on video or TV.
Also keep in mind that some movies (like Tokyo Drift) are shot in a nearly square frame and then cropped to both 4:3 and 16:9.
Daddy shaym finally caved into the pressure of Recommended
That's cool, Doc!
He didn't even mention the most important thing to consider when buying 4K monitors/TVs: pixels per inch and screen size. I use a 4K 40-inch TV as a PC monitor because its PPI is 110, versus 55 on my old 1080p 40-inch TV. At the close viewing distance I had it at (maybe a little more than 1.5 feet), the pixels are very noticeable/distracting when gaming at 1080p but not a problem at 4K. To get the same pixel density of 110 PPI at 1080p, the screen size has to be 20 inches LOL, not nearly as immersive. (Quick math check below.)
Shit most gamers don't even want to use 20 inch monitors. 4k gaming even on 27 inch monitors is noticeable as fuck
@@Aragon1500 Agreed, I do wonder if the jump from 4K to 8K is noticeable, however
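The PPI numbers a few comments up check out, by the way. A quick sketch, assuming standard 16:9 panels; the helper names here are just for illustration.

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixel density of a panel from its diagonal and resolution."""
    return math.hypot(w_px, h_px) / diag_in

def diag_for_ppi(target_ppi, w_px, h_px):
    """Diagonal (inches) a panel needs to hit a target pixel density."""
    return math.hypot(w_px, h_px) / target_ppi

print(round(ppi(40, 3840, 2160)))            # ~110 PPI for a 40" 4K panel
print(round(ppi(40, 1920, 1080)))            # ~55 PPI for a 40" 1080p panel
print(round(diag_for_ppi(110, 1920, 1080)))  # ~20" is the largest 1080p panel at 110 PPI
```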
I remember enjoying this video when I first watched it. Now I see that the information in it is largely opinion being presented as undeniable and well researched fact, and often wrong. Makes me worried about the other videos on this channel.
Yep, just found this video in my recommended and it's downright silly and inaccurate at times.
@@XiahouJoe His predictions were inaccurate (because he underestimated the power of advertising and the gullibility of the average consumer), but what else do you think was inaccurate?
P.S. It should be noted that a lot of the information here can also be found in several Vsauce vids.
@@InsomniacPostman I should note that YouTube keeps deleting my comments citing links that show he's wrong and that people can see the difference between 4K and HD; for example, search the term "Can people see the difference between 4k and HD" and read the first article.
@@XiahouJoe I've had a 4K monitor for the last 7 years and, other than a reduction in motion blur, not a single 4K video or game has looked any different from a 2k one. Not to mention that most "4K" films are actually 2K upscaled to 4K so all you're getting is a lot of unnecessary pixel doubling.
@@InsomniacPostman That's great, but your anecdote means jack squat compared to controlled study comparisons.
200 FPS on a 60Hz monitor actually feels smoother because the GPU has a denser, and therefore more up-to-date, pool of frames to fetch from, so whatever the monitor scans out is newer on average. Also, the game gathers input data more often: instead of sampling input only 60 times per second, it can sample it 200 times per second, which reduces input latency.
So no, it isn't just an illusion.
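A quick way to put numbers on that: with rendering uncapped, the newest finished frame at each 60Hz scanout is younger on average, and input is sampled once per rendered frame. A toy sketch under simplifying assumptions (no vsync queueing, perfectly even frame pacing):

```python
def frame_time_ms(fps):
    """Time between rendered frames, in milliseconds."""
    return 1000 / fps

for fps in (60, 144, 200):
    ft = frame_time_ms(fps)
    # With rendering uncapped and no vsync queue, the newest finished frame
    # at each 60Hz scanout is on average about half a frame-time old.
    print(f"{fps:>3} fps: new frame + input sample every {ft:5.2f} ms, "
          f"avg frame age at scanout ~{ft / 2:4.1f} ms")
```

Going from 60 to 200 FPS on the same 60Hz panel cuts the average staleness of the displayed frame from roughly 8 ms to roughly 2.5 ms, which is small but noticeable in fast games.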
Goldfish_Vender I agree with you
Gamers are hilarious. They think they see things, but really they're just playing like children.
8:45 Have you ever wondered why virtually no one actually uses the game's built-in motion blur? Because it looks incredibly unnatural, which creates even more motion sickness. It also adds a ton of input lag, which makes the game feel unresponsive and creates even more motion sickness.
Philip Martin That's wrong. It looks unnatural because it blurs all the wrong things. Motion blur should only be applied to things that move with respect to your retina. The game assumes your eyes don't rotate and that mouse movement is your head movement. So if your eyes follow an object moving across the screen, where you'd expect perfect sharpness, you instead see a blurry mess.
The reason motion blur is tolerated in racing games and the like is just that the motion of the ground near the camera is hundreds of pixels per frame and we don't have 1000 Hz displays yet to do natural motion blur.
I find a very light use of motion blur enhances my immersion, and of course it should be done right (blurring only the object that moves rather than the background, and so lightly that it is barely noticeable).
Modern motion blur doesn't introduce input lag. Motion blur used to average a number of frames which could introduce input lag, but modern motion blur uses the relative speed of objects and a post-processing shader. There is no lag at all, assuming the shader doesn't lower your frame rate.
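For anyone curious what such a post-process pass actually does, here is a minimal CPU sketch in Python/NumPy. It assumes you already have a per-pixel screen-space velocity buffer; real engines do this in a GPU shader with bilinear taps and depth-aware weighting, so this is only an illustration, not any engine's actual implementation.

```python
import numpy as np

def velocity_motion_blur(frame, velocity, taps=8):
    """Minimal post-process motion blur: average `taps` samples taken along
    each pixel's screen-space velocity (in pixels per frame). Nearest-neighbour
    sampling keeps the sketch short; a real GPU shader would use bilinear
    filtering and depth-aware weighting."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    acc = np.zeros(frame.shape, dtype=np.float64)
    for i in range(taps):
        t = i / (taps - 1) - 0.5                 # taps centred on the pixel
        sx = np.clip(np.rint(xs + velocity[..., 0] * t), 0, w - 1).astype(int)
        sy = np.clip(np.rint(ys + velocity[..., 1] * t), 0, h - 1).astype(int)
        acc += frame[sy, sx]
    return (acc / taps).astype(frame.dtype)

# Toy usage: blur a random 1080p frame as if the camera panned 30 px this frame.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
velocity = np.zeros((1080, 1920, 2))
velocity[..., 0] = 30.0
blurred = velocity_motion_blur(frame, velocity)
```

Because it only reads the current frame plus a velocity buffer, it adds no extra frames of delay, which is the point about input lag above.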
uzimonkey really? I didn't know that... well I haven't used motion blur for years so...
Philip Martin Interesting stuff, thanks. I think all the talk of resolution (in the digital number-of-pixels sense), frame rate, etc. comes from assuming the eyes and brain are analogous to a camera and computer... which they're not in any way beyond a very simple metaphor. The eyes and the brain are more like an analog system than a digital one in most ways, but not that similar to either. It's almost as if we now view the world through a technological lens and overly compare biological systems to technological ones. In the brain or the eyes there isn't a single piece of metal bigger than a few atoms, there isn't a single switch or transistor (in the sense we use them electronically), signals are carried electrochemically rather than over wires, and much more.
Another example I keep thinking of is memory. Memory isn't some grid of cells somewhere - there is memory *at every level*; it's memory all the way down. "Memory" (past events leaving a mark and being able to affect the future) is present from the tiny chemical level all the way up to the physical organisation of cells. An event induces changes in the concentrations of various ions, in proteins that can change shape, and in levels of gene expression, and the chemicals released obviously stick around a while (depending on the chemical, anywhere from nanoseconds to hours or longer).
And that difference in how long something - say, a hormone release - will have an effect for shows another really fundamental thing. In computers and anything digital, time is made up of discrete sections or chunks, like the clock cycle of a processor, the frame rate of a camera or display, or the sample rate of some audio. A processor cannot do anything in a shorter time than that, and everything it does is made up of integer multiples of that time. It simply isn't possible to have continuous time with that type of technology. Furthermore, the intensity of something has to be an integer multiple of some unit, which it doesn't have to be in organic systems.
I've barely scratched the surface, but think that's enough to show how different organic senses and processors are to digital ones.
"Wont be mainstream for another 5 years"
*A year later and every supermarket and electronics store that sells televisions has their shelves completely lined with 4K TVs*
4K is in and 8K is already on its way, with a 32" monitor from Dell that launched at $5,000 and has already had its price cut in half, with some units being sold at $2,500. Granted, that's still crazy expensive. The point is it's approaching faster than anyone realizes.
The importance of resolution is entirely dependent on how big the screen is, and how close your eyes are to the screen. The bigger the screen, or the closer your eyes, the smaller the pixels need to be for you to achieve an equally sharp image. Take VR headsets for example, their tiny screens have resolutions higher than 1080, but since your eyeballs are mashed right up against them, you can actually see individual pixels clearly. You'd NEED to achieve 8K resolutions in order to get an image as crisp as a typical HD television that you sit far away from.
Also, in response to the whole "remember when you thought VR was going to be a thing?" Yea, it was and still is a 'thing' you goob. Sure, it still hasn't broken out into massive mainstream appeal, but the technology is still being improved.
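To put the VR point in numbers, what matters is pixels per degree of field of view rather than raw resolution. A rough sketch; the headset figures (about 1080 horizontal pixels per eye spread over roughly 100 degrees) are ballpark assumptions for an early consumer headset, not specs for any particular model.

```python
import math

def tv_pixels_per_degree(diag_in, w_px, h_px, distance_ft):
    """Approximate angular resolution of a flat panel viewed head-on."""
    width_in = diag_in * w_px / math.hypot(w_px, h_px)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / (distance_ft * 12)))
    return w_px / fov_deg

def hmd_pixels_per_degree(w_px_per_eye, fov_deg_per_eye):
    """Very rough headset figure: one eye's pixels spread across the lens FOV."""
    return w_px_per_eye / fov_deg_per_eye

print(round(tv_pixels_per_degree(55, 1920, 1080, 8)))  # ~68 ppd: pixels invisible
print(round(hmd_pixels_per_degree(1080, 100)))         # ~11 ppd: clearly visible pixels
print(round(hmd_pixels_per_degree(3840, 100)))         # ~38 ppd with a far higher-res panel
```

A living-room 1080p TV already lands around 60+ pixels per degree, while a headset spreads far fewer pixels across a much wider field of view, which is why the same "1080-class" resolution looks pixelated an inch from your eyes.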
Your reply really depicts the notion of a "bandwagon". By mainstream he meant not just buying 4K TVs but also 4K content being available for the average consumer to watch without spending extra. Most consumers only get 1080i from their cable providers; heck, some channels still broadcast in 720p. So just because you go to Walmart and buy a 4K TV doesn't mean the average consumer is actually consuming 4K content. Also, know the difference between upscaling, pixel shifting, and native 4K.
Personally I've never been able to detect a difference above 1536p (the maximum resolution on my Sony Trinitron monitor), and at that resolution I think my monitor actually pumps out a superior image to an LCD due to its superior color depth. As far as framerate is concerned, I don't detect any difference above 70 Hz or so. As for VR, that technology is "almost" mainstream now. The Samsung Gear VR is probably going to be the device that makes VR mainstream - that or the Oculus Go.
Personally I think the only reason 4K became a mainstream format for TVs is that it has a bigger number, which marketing can always spin into an "it's 4000% better than what you have now" argument. In reality, though, content delivery is still at 1080p and will likely remain that way for some time.
@@johnrickard8512 People with a PS4 Pro or Xbox One X are justified in buying a 4K TV, as are those who watch UHD movies. Anyone using it purely for regular TV has been sold a lie; although they are future-proofed, they probably paid more for it.
Japan had 8K channels when I was there in 2011. That is how far we are behind here in Australia.
A lot has changed since this video was released. 4K 120Hz TVs are becoming more common, but gaming at 4K 120fps is still not achievable for the average gamer. We are getting closer to that each year, though. For now, I am happy with my 1440p 165Hz monitor. It's more affordable and easier to drive than 4K but still looks crisp.
A 165Hz monitor is pointless, though.
You wouldn't notice a difference at anything above 144Hz. I understand why you got a 165Hz monitor; it's actually easier to find than 120Hz or 144Hz monitors because they are raising the standard. However, I feel bad for people who bought a 240Hz+ monitor and then downgraded to 144Hz because they realised there is no difference; a lot of YouTubers have done that.
@@TheAazah Yeah, unless you are a professional tournament gamer and need every frame and millisecond for quick response times, it's not worth spending all that extra money on 240Hz. I personally don't notice a difference from 144Hz to 165Hz, and there is only a handful of games I play that max out the monitor. The biggest thing I did notice was the crispness of going from 1080p to 1440p. The LG monitor I got just happened to be 165Hz, but I don't think I will actively look for anything above 144Hz in the future. I think we are a few years too early for 4K 144Hz to be common for the average gamer.
@@TheAazah This is untrue. The limit of human perception is closer to 1000Hz, although that figure usually refers to flashes of light, not necessarily framerate. I can notice a difference all the way up to 360Hz so far (there are no faster panels available right now), though the main benefit is the reduction of motion blur. It's less about "seeing more detail/information" and more that moving objects approach the point of zero motion blur, and it's all up to your eyes after that.
I'd recommend watching OptimumTech's video on the XL2566K to see what I mean. For tracking-heavy, fast-movement games, it's a pretty stark difference.
*Addendum: I mean panels that can properly handle high refresh rates. Only the XL2566K and PG27AQN (both of which I own) can handle 360Hz correctly; none of the first-gen panels can. There is a massive difference between "fast" 240Hz and 360Hz panels and their first-gen counterparts, so be wary.
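The motion-blur point can be put in rough numbers: on a sample-and-hold panel, an object your eyes are tracking smears by roughly its on-screen speed times how long each frame is held. A simple sketch, assuming full persistence (no backlight strobing) and a 3000 px/s tracking speed picked purely for illustration:

```python
def sample_and_hold_blur_px(speed_px_per_s, refresh_hz):
    """Approximate smear (in pixels) of an eye-tracked object on a
    full-persistence display: its speed times how long each frame is held."""
    return speed_px_per_s / refresh_hz

for hz in (60, 144, 240, 360):
    smear = sample_and_hold_blur_px(3000, hz)   # 3000 px/s is a fast flick/track
    print(f"{hz:>3} Hz: ~{smear:.1f} px of smear")
```

Higher refresh keeps shrinking the smear, but on a full-persistence panel it never quite reaches zero.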