At 47", you can definitely tell the difference in quality between 1080p and 4K. But it also depends on how far you sit from the TV. I think the minimum resolution for various TV sizes is as follows:
Up to 32": at least 720p
Up to 55": at least 1080p
55" and above: at least 4K
Yeah, but anything is "Retina" if you are far enough away. 8K on TVs is a waste, since chances are you will sit further back if you get a TV over a certain size. Say you sit 2 meters away, like I do: 4K is retina until you get to about a 140 inch TV. I had a 32 inch 1080p set and got a 55 inch 4K a few days ago, and sitting 2 meters away I can't see a difference in quality, but it's big enough now that I would have noticed a difference if I had bought a 55 inch 1080p instead of this 4K one. As size gets bigger, resolution should too, but 4K is already so dense it makes 8K almost completely pointless, and anything above that will be 100% pointless.
@@MrZodiac011 I started using 4K TVs as monitors ~7 years ago, because monitors with that resolution were either not available or just insanely expensive. Saying "8K doesn't make sense" is just denying every use case not covered by "sitting on a couch, watching TV".
@ I mean, you can use a TV as a monitor and vice versa, but that's not what this is about. By talking specifically about 8k TVs and not 8k displays, it's automatically implied that we're talking about television (and console gaming). High resolution monitors and non-consumer displays/TVs definitely have their place.
Damn, when you said 2030 I automatically thought 15 years from now and had to wait a second to realize that it's only a decade away. Man, time moves too fast.
@@yanceyboyz Yeah, I can't comment on that as I don't own a 4K screen. I wasn't denying that 4K is useful; I was just saying that 1080p is definitely an upgrade from 720p.
@@beepbleepboop I actually enjoy it. I need a rather big screen, and if it wasn't 4K it would not look as nice. For competitive play, though, yeah, it's just not necessary, but nice to have.
Mk2 eyeball
pros:
+ able to discern more colors
+ more strength added to eye muscles, reducing the chances of eye strain
+ ability to choose the eye color
+ able to zoom in and out
+ adjustable sensitivity to light, giving better vision in low light
+ added layer of water-resistant materials
+ ability to discern smaller details not present on the Mk1 eyeball
+ ability to add notes from the brain, enabling better focus
+ wider field of view, allowing more vision with less head rotation
+ able to see and feel moving particles, allowing the brain to see through echolocation and wind
catches:
- very expensive to manufacture and put on
- requires more energy to operate, requiring more time to sleep and eat
- chance of failure leading to blindness if not applied properly
- requires high precision to apply and replace, as the materials used are currently very delicate
- requires more maintenance
- quickly heats up, as the cooling system is still under development and testing
manufacturing: $1,000,000,000,000,000,000,000 plus tax and shipping (application and maintenance not included)
Yep. Bigger TV means you need a better resolution to be as clear as a smaller TV. Hard to believe my phone has as many pixels as my TV lol, but then again, my phone was much more expensive than my TV
Yeah, distance and screen size are by far the most important factors. I'd say for 99% of consumer TVs there will be very little point in owning anything beyond 8K at most. 8K and over will only really be good for commercial applications like movie theaters or large public screens. But that is exciting, because once 8K becomes the standard, there will be more motivation to focus on improving the panels and the form factors of TVs rather than increasing pixel count.
I'm sitting here watching football, hungover, on my couch 9 feet away from my 70 inch 1080p Vizio TV from 2014, and I can't see any pixels; it's very clear. In fact, the reason I haven't upgraded to a 4K TV is that the only ones that look better are the OLED screens, and I ain't paying $3k for 4K 😂
Uh...no. Still overkill for 75". IMO, unless you have a living room big enough for a Jumbotron, 4K (ahem, UHD) is more than good enough, even for a 120" (10-foot) screen.
Exactly! I got to the end and thought is he really just talking about pixel density.... lol. Him saying 4K looks better because of quote "colour".... bahahaha
His explanation was intended for consumers who have no idea what HD, FHD or 4K means. It wasn't intended for you lol. He did a great job explaining it, and the number of views on his video speaks to the quality of his explanation.
Only on a 42 inch TV. From 10 feet away on a 50 inch, the average person can already see pixels on a 4K TV; 46 inches is where they start to become visible. Anything over 95 inches from 10 feet away will also show visible pixels even at 8K; once you go above 95 inches you need 16K for perfect fluidity. If you sit closer to the screen, say 6 feet, 30 inches is the cutoff for needing 8K and 66 inches is the cutoff for 16K. If you game at a desk with your monitor 3 feet away, even on a 30 inch monitor you'll be able to tell the difference between 8K and 16K. At that distance, 4K is only a perfect density for screens of 16 inches or less, e.g. a 15 inch laptop screen, where 4K is ideal.
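Cutoffs like these are testable with a little geometry: a pixel stops being resolvable once it subtends less than the eye's acuity angle. A minimal Python sketch, assuming a 16x9 panel and the textbook 1-arc-minute figure for 20/20 vision (note it yields much shorter distances than the cutoffs above, which imply sharper-than-20/20 eyesight):

```python
import math

def max_resolving_distance_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Distance beyond which adjacent pixels blur together for a given acuity."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16x9 panel
    pixel_in = width_in / horizontal_px              # pixel pitch in inches
    theta = math.radians(acuity_arcmin / 60)         # acuity angle in radians
    return pixel_in / theta / 12                     # small-angle approximation, feet

for diag, px, label in [(50, 3840, '50" 4K'), (95, 7680, '95" 8K'), (30, 7680, '30" 8K')]:
    print(f"{label}: pixels blend beyond ~{max_resolving_distance_ft(diag, px):.1f} ft")
```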
Sorry to break it to you, but the human eye can see 576 million pixels. 32K resolution is a little less than that (530 million pixels). Beyond 32K is pointless.
Beyond 4K is useless for how most consumers use standard TVs and monitors in their homes. It is not useless as a video format, though. It's also a future-proofing measure, as there are many kinds of displays, and such images will likely be able to take advantage of that level of pixel density in the future. It allows for zooming in with increased clarity, which could be useful for future security monitoring technology.

On top of that, it's useful for very large screens. As in theater-sized, not home-sized. Larger screens require greater pixel density for the same level of clarity as a smaller screen.

It also matters for HMD devices such as VR headsets. The screen is literally right in front of the user's face, and has lenses that alter and magnify the image so that it appears correctly to the viewer's eyes. A 4K screen still has the "screen door" effect, where you can see individual pixels, on devices like these. You need to get closer to 8K or beyond to eliminate it. The closer you are to a screen, the more pixel density matters. This would include AR virtual screens that a user might commonly walk right through in the future. Think someone wearing AR glasses, walking down a sidewalk with virtual ads displayed outside of stores.

In addition, very high pixel density will be useful for virtual screens. Things like VR and AR will make use of higher pixel density in ways that users will likely be able to perceive, not just because of the user's physical proximity to the screen, but also because of how virtual screens can be manipulated and used in virtual environments. Think of it as the same reason why 4K textures can still matter in a game that only displays at 1080p: a player looking at a 1080p screen can perceive the difference because a 4K texture retains more detail as the player gets "closer" to it in the virtual space. The same thing applies to virtual screens.

Basically, beyond 4K is useless for standard consumer-level home screens, but it still matters as a future-proofing technology that has some limited uses today and will be increasingly important in future technology.
That’s largely a BS claim as there are no real consumer grade VR headsets running 4K screens, and the distance to the screen for 4K is literally right on the edge where the pixel detail is blurred together and indistinguishable. At most you would need 5K to make that disappear completely. For monitors, it will make sense for content creators that want to be able to edit many of their images in what is close to a 1:1 ratio. As for future proofing, do you honestly see a day when the average consumer has a TV larger than 160 inches? I do not and I don’t see a time when internet speeds to the home will average the speeds required to meet a quality stream in the next 5 or 6 years. So you’ll likely be buying a much cheaper 8K TV by the time everything else catches up, and even then you would have to be watching that TV at a distance of less than 5 feet to see a huge improvement on a 60” TV screen.
@@p3rrypm Incorrect; that's mostly a straw man, as it doesn't accurately reflect what I said. First of all, I specifically said it's viable as a format, not as commercially viable consumer hardware right now. That means 8K image capturing and files rather than desktop monitors or televisions.

There are several consumer-grade VR HMDs. That's literally what Oculus, WMR, and Vive are. There are also higher-end VR HMDs that aren't really for consumers, such as Pimax and a few others. Some 4K HMDs having very little screen door is a result of a lack of sharpening: the image is literally softened and blurred slightly to reduce screen door. The holy grail is to get to that point without needing to soften the image, so that fine detail is retained. 5K is not enough to accomplish that. I've kept up with modern VR development since the Oculus DK1, and have seen several professional statements to the effect that 8K is the goal for screens that can provide a decent FOV without screen door. 8K VR screens exist now in the form of Pimax HMDs, and due to the wide FOV of the display, screen door is still visible even at that resolution. Basically, the more you stretch the image, the higher the resolution needed to eliminate screen door.

It would also be useful for 180 or 360 degree video files for the same reason: the image is stretched across a larger area, which lessens the output quality significantly despite the resolution being high. Again, 8K 360 video files exist, and even when viewed through a high-end HMD, the image quality is not as good as even 1080p on a standard screen. It's not just a limitation of the hardware, but of the format itself. That's not even getting into things like AR or retinal displays, both of which are existing technologies today. They aren't available on the consumer market and probably won't be for a while, but they do already exist as prototypes.

There's also the 4K texture factor I mentioned. 4K textures are useful even in a 3D environment that only displays at 1080p, because the user can get closer in the virtual space and the texture will retain a higher level of detail. A good example of how this might be useful outside of a video game would be an 8K image or video file on a virtual screen in a 3D environment that the user can move around in. A virtual art gallery would be one non-game example.

8K has a lot of uses for future technologies. It's a good standard of image quality that is relatively useless in standard consumer-level displays, because no one uses them with their nose touching the screen, but it is still a viable format for recording and rendering, because various future displays, both virtual and actual hardware, will likely be able to take advantage of the resolution.
Just going to put this out there: 4K is great for work and writing documents, because on 1080 the letters are quite difficult to see. You can still make them out if the text is large enough, but having 4K means you don't have to squint or give a second thought to what a letter is unless it's a ridiculously small font.
@@Dizastermaster. No, it isn't, depending on which route you're going. If you're gaming (in the future, because 8K gaming right now is not exactly practical for modern PC games, requiring DLSS or rendering at 5K or lower), it'll make anti-aliasing pretty pointless, and games will look amazing. (If you're wondering: yes, you *PROBABLY* can get playable framerates in *OLDER* PC games at native 8K, but my focus is on modern games, and I don't expect you to run Devil May Cry 5, Cyberpunk 2077 or Doom [for example] at native 8K in 2021.)
Yes, but the jump between 4K and 8K isn't worth the performance hit, and a better 4K monitor (refresh rate, pixel response time, colour accuracy) will always be much better value for your money than an 8K monitor.
@@Ah-ec5ch Uh... perhaps in 2021, but what about the future? Because yeah, right now I see no one picking 8K over 5K, 4K, 1440p or, heck, 1080p; there are people who still game in 480p.
TV as we know it will eventually go away, as other tech becomes more convenient. 8k (TV or otherwise) isn't useless, it just is for most people right now. That's what this guy doesn't make clear imo
@@blahuhm6782 I doubt TV will disappear for quite a long time; even if classic TV broadcasts end, smart TVs exist. And 8K is probably useless on a home TV, as 4K is on a smartphone (unless you use it for VR).
@@francescocastaldo7469 The resolution must evolve, both for filmmakers in their process and for end-user viewing in multiple ways. You can't detach the hardware capability from its output. 16K cam, 16K output. End of line.
There is a use for higher resolutions. It allows us to make bigger screens, and lets the observer sit closer to those screens without being able to distinguish individual pixels. So, if you have two same-size screens, one 1080p and one 4K, there is a distance from the screens at which you can't tell the difference, and the 4K becomes useless. However, if you get closer than that point, the 1080p screen will start to lose clarity while the 4K still looks sharp. It's all about screen size and distance to the observer.
There is a huge difference in image quality between 1080p and 4K on my 55 inch TV. While the "resolution" doesn't change, the bigger the screen the larger the pixels (at the same resolution), which affects the clarity and acuity of the image. Not sure what this guy is talking about.
I have the same thoughts. My 60 inch 4K Samsung looks great with good 4K content. The 4K feed from DirecTV is way better than the HD feed on the same TV lol.
Netflix did some studies that showed HDR at 1080p was more satisfying to viewers than 4K for typical 55 inch TVs. My 2c on this is that people now buy bigger TVs for smaller rooms, and hence they can actually tell the difference between 4K and regular 1080p HD on 65 inch TVs and sometimes on 55 inch TVs.

There's also the difference in image processing. Newer TVs make better choices about how to upscale a 720 or 1080 image to a 4K screen. I have two OLEDs, from Sony and LG. Both have AI which uses a machine learning model to create the optimum upscaling. However, no upscaling is perfect, and if you look very carefully you can get artifacts in the image that indicate where the choice wasn't ideal for human vision. A 4K broadcast captured from 4K or higher material won't have upscaling artifacts; it should look more natural. Throw in HDR, and the colour reproduction will be more realistic. It's a double win, but not quite perfect.

The latest OLEDs look so good because they match natural colour reproduction (sometimes pushed to be more vivid on the LG, tbh) with 4K resolution and very high contrast. The black levels are much more realistic, and the overall impression is more immersive and less like watching a screen. The machine learning even does object recognition and can adjust neighbouring pixel contrast ratios to create a more realistic depth of field. My missus thinks that with default settings the LG is more impressionistic whereas the Sony is more natural, but you can tweak either to make the image more or less vivid.
There shouldn't be any difference regarding pixel count, as the ideal viewing distance puts the display at around 30° from the top of your nose to the lateral borders of the screen. What you see as a better picture is likely the higher bandwidth and wider color gamut 'included' with newer standards and containers, not resolution.
Bruh, I can see the pixels with my naked eye from a certain distance on my 1080p TV; a 4K one is way sharper. And don't start with the Netflix and YouTube content; those have bad bitrates. YouTube crushes the bitrate of a video so much you can't even call it 4K anymore. You should try watching some raw footage or a 4K Blu-ray; that's where you really see the difference between 1080 and 2160...
There is no proof that Bill Gates said that. Not complaining, just saying that we, as spectators, don't really know whether Gates actually said it, and he himself denies that the quote is his.
It depends on the screen size and how far you view the screen from. People were using 40-60 inch 1080p TVs as computer monitors at much shorter distances than you are considering; the pixel density of 1080p at that size does not look great for reading normal text documents, but is still decent for video games and movies. For 40-60 inch monitors, 4K gets the density back to somewhat better than a standard monitor, but it is not until 8K that they get on par with cell phone pixel densities.
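For reference, pixel density is just pixel count over physical size. A small sketch of the comparison this comment is making; the sizes are illustrative, not from any specific product:

```python
import math

def ppi(diagonal_in, h_px, v_px):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'50" 1080p TV:   {ppi(50, 1920, 1080):.0f} ppi')  # ~44
print(f'50" 4K TV:      {ppi(50, 3840, 2160):.0f} ppi')  # ~88
print(f'50" 8K TV:      {ppi(50, 7680, 4320):.0f} ppi')  # ~176
print(f'6" 1440p phone: {ppi(6, 2560, 1440):.0f} ppi')   # ~490
```

Even 8K at 50 inches lands well below phone densities, which is the comment's point.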
It was, when 1080p first came out, because of the TV sizes most consumers had at the time. It came out back when huge flat-screen TVs weren't even around yet.
I think his statement stems from the example of a 47 inch TV. With that, and say four yards of viewing distance, there are surely more important aspects than pixel density. For a 70 inch screen at the same distance, of course the difference is huge.
I mean, they both have a point. The increase in resolution only makes sense if screens keep getting bigger and living rooms get smaller. Standard definition stretched over a 50" set 3' away is borderline gut-wrenching, while a 14-18" set 6' away doesn't need to be HD.
I use a 48 inch OLED TV as a monitor sitting 3-4 feet from it and 1080p looks like garbage compared to 4K. 1080p looks fine when watching movies on it from 10 feet away though.
So that's why Apple says that at phone size, 720p is the same as 1080p: the eye can't tell. Then Samsung released 1440p as a gimmick, because the phone is too small to tell anyway. Hmmmm. I learned a lot.
@@KhoiBoa When you look at your phone, you are looking at it from close distance. The smaller the distance the better you can see detail. So you cán tell the difference between 1080p and 1440p on lets say a 6 inch smartphone. If not, your eyes just aren't that good
Yeah, this guy gets so many things wrong, misses other things, etc... bad video overall. Like the dude years ago saying we can't see in 4K. What dumb-dumbs...
@@blahuhm6782 I play at 1080p from a couch. Anti-aliasing barely makes a difference. So, the problem: your eyes can't notice much of a difference between 4K and 1080p, especially at smaller screen sizes. It becomes significant on bigger TVs which take up half your wall's height. Proved by the fact that most gamers never upgraded to 4K; 1080p will stay the standard for the terribly tiny screens most gamers play on.
@@meghanachauhan9380 Once 4K 60 is more affordable it will be the standard. Obviously in gaming, no matter how tiny the increase, better visual fidelity is ALWAYS helpful. It’s a must in any sort of competitive gaming, those small increases can give you the edge over an opponent. The problem is right now, I’m not gonna sacrifice either 60fps to play 4K. And I’m not gonna sacrifice my wallet to game at 4K 60.
Science: chances are you will not be able to tell the difference in quality and sharpness, the reason being there are millions and millions of pixels packed in here. Wow, so impressive...
@@BerkeBoz Not really science; you kind of just made his point. I worked at the UW eye lab in undergrad for a year. During that time we actually tested these sorts of side-by-side comparisons. We color-tuned the monitors to 99% accuracy using the Adobe color space, set their brightness to be identical, and ensured that the pixel gaps for each monitor were as close to identical as possible (to prevent the screen door effect). At typical viewing distances, we found people with 20/20 vision could quickly discern the difference between 1080p and 4K with greater than 80% accuracy.

Now, if your vision has deteriorated, you'll have problems. If your viewing distance is greater than average, you'll have problems, and, probably most relevant, if you just don't care about image sharpness and aren't looking for the differences, you may not notice them. Interestingly, while people sometimes don't notice increased pixel density, they are very sensitive to decreased pixel density. So someone who has been viewing 4K images for months, and didn't think the difference was large before, will, when suddenly subjected to 1080p video, notice a stark decline in sharpness (this is double-blind, mind you).

Realistically, the sweet spot for most human vision tends to be between 1080p and 4K. Maximum visual acuity is around 35 arc seconds, with 20/20 vision falling between 35 arc seconds and 1 arc minute. That said, if the pixels of a device could occupy our entire field of view, the resolution required to ensure no artifacts and maximum use of our visual acuity is a little above 20K (a 200 degree field of view divided by 35 arc seconds is 20,571.43 pixels).

Realistically, though, that's not how the human eye works. Only the fovea is high resolution, so if a TV were to actually simulate human vision, it wouldn't have the same resolution across the entire panel: we would only stare at the center of the screen, which would be at 35-arc-second resolution for whatever distance we were sitting at, and the resolution would sharply drop off toward the peripheries of the screen. For VR this is useful to know for foveated rendering, but the resolution of the entire screen must still be at maximum foveal resolution, since the eye rotates independently of the head.

In addition, a lot of our visual system is neurologically mediated. For complex images that aren't faces, we have an effectively lower viewing resolution because of how the brain processes images. For faces, the brain creates a truer-to-life representation requiring higher fidelity, and for abstract single objects the resolution is maximized. A single-pixel white line slowly rotating on a 4K screen will show obvious visual artifacts, called aliasing, because our brain and visual system are capable of picking out the pixels and obvious artifacts of such a simple figure. Artifacts will continue to be visible all the way up to 16K for people with "better than 20/20" vision, meaning the best vision available. For people exactly at the threshold of 20/20 vision, 10K may be enough for them to cease seeing artifacts, since their resolving capability is at an arc minute. For most visual patterns, though, the brain simplifies, tending to cluster objects together into colors, patterns, shapes, and identifiable figures. Representations are created that aren't really there in the peripheries.

You think you see in color and in detail things out of your peripheral vision, but you really don't; it's just your brain filling in the gaps for you. In any case, "generally", for most people, particularly as you get older and your vision declines, and if you have your TV sitting 10' away across a living room, 1080p is more than adequate. If you have good vision and you sit within 5' of your TV, this video becomes relatively inaccurate. If you are playing in VR and the screen is an inch from your eye, this video is totally irrelevant, because it did in fact not explore any of the science of visual acuity at all; it just made general statements about resolution that have nothing to do with eyesight.
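The full-field number quoted in this comment is easy to verify; a one-line check of the arithmetic (200 degree field of view at 35-arc-second resolution):

```python
fov_deg = 200                 # approximate horizontal field of view
acuity_arcsec = 35            # best-case foveal acuity quoted above
pixels = fov_deg * 3600 / acuity_arcsec
print(f"{pixels:,.0f} pixels across the field of view")  # ~20,571
```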
LifeAccordingToMayo well whatever point he’s getting at is misleading since he didn’t mention pixel density and tries to fabricate a discrepancy through “marketing.” As if TV manufacturers don’t disclose their actual pixel count and aspect ratios before you buy
Noah Mathis He didn't mention pixel density, keeping in mind that viewers like you would already understand what he's getting at. He mentioned the 47 inch 1080/4K TV vs the 110 inch 1080/4K TV to explain to the average consumer that images become softer as pixel density decreases. And TV brands do mislead with their screen sizes, as they include the bezels in the measurement.
@@xKingston111 That's true. It's just like pricing something at $59.99 instead of $60, for example: no difference at all, but 59.99 still feels like less. Marketing is everywhere. And whatever he said, I already knew, so he didn't tell a penny's worth of lies. Except "16K is a marketing hoax" lmao.
7:53 As he said that, I checked my YT settings and it was on 720 because it's on auto. When I switched to 4K: completely different person lol. 3840 is awfully close to 4000, so it makes perfect sense to call it a 4K TV. Hell, what we call a 2x4 stud in construction is actually 1.5x3.5 inches; I don't think anyone feels ripped off :) I own a 48" 4K TV, and the first time I noticed a huge difference between standard and 4K settings was when I played HZD. It's noticeable.
"The human eye will barely be able to tell the difference" - He didn't even explain why human eye won't be able to tell difference on 47" but would be able to tell on110"
Everybody every time there's a new standard resolution: "Let me explain the science about why this is pointless." Remember when people said your eyes can't notice the difference between 720 and 1080p and then 1080p and 4K? I do.
He said it is pointless for certain ranges of TV sizes. 10-15 years ago, 32" was the norm; today it's up to 55"... In the future, who knows?! Maybe a 100" screen or projector...
They said this when screen sizes of the average TV and desktop monitor were much smaller. The American broadcast networks went to 1080 in the 1998-99 period. The average LCD TV at that time in the United States was 24 inches. At that size, there's no significant difference between 720 and 1080 at average TV viewing distances. The consumer 4K standard was set in 2012. At that point, the average TV size was 38 inches, and there's no significant difference between 1080 and 4K at that size at average TV viewing distances.

Let's use a 42 inch screen, since that's a pretty standard size. A 42" 16x9 screen has actual dimensions of about 36.5 x 20.5 inches, giving, at 1080 resolution, a pixel size of about 0.019 inches. If you're sitting 8 feet away from a TV of that size (not unreasonable), each pixel would have a visual angle of 0.0113 degrees. At 6 feet away, 0.0151 degrees. The problem is, a person with 20/20 vision has a visual acuity of only 0.0167 degrees. This means that a 1080 42" TV at 6 feet away is essentially at the limit of human resolution for that person. If you had a series of alternating fixed black and white lines on that TV screen, each a single pixel wide, it would be _just_ detectable that it's a series of lines and not simply a solid gray screen. If you doubled the resolution (to basically 4K), each pixel would subtend, at 6 feet away on that same size screen, 0.0075 degrees, which is impossible for a human at that distance to make out. The series of alternating lines would just be flat gray. The only way you'd get the detail back would be to double the size of the lines... which is taking you back to 1080. And this is before taking into account image movement, which TV usually involves, and which would cause blurring and loss of such fine detail anyway.

You can do this for other screen sizes. On a 60" TV, 4K is still not quite in the range of (normal) human visual acuity at 6 feet away. It's close (0.011 degrees), but back off to 8 feet and it drops to 0.0084, or about half of the necessary angle. Which means that on a 60" TV watched from 6 feet away, you can perhaps find 4K just a little sharper than 1080. At 8 feet away, there's literally zero difference. You might, as was pointed out, perceive it to be clearer, but that's more likely a function of the cameras and TVs having things like better colour depth and faster frame rates than of the increase in resolution.

This is also why none of this applies to computer monitors. You sit closer to a monitor than a TV, and the images are more likely to be static at least part of the time, which allows you to actually detect the higher resolution.
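The per-pixel visual angles above can be reproduced directly. A minimal sketch using the same small-angle geometry (16x9 panel, 1-arc-minute acuity for 20/20 vision):

```python
import math

ACUITY_DEG = 1 / 60  # 20/20 vision resolves about 1 arc minute

def pixel_angle_deg(diagonal_in, vertical_px, distance_ft):
    """Visual angle subtended by one pixel on a 16x9 panel."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)
    pixel_in = height_in / vertical_px
    return math.degrees(pixel_in / (distance_ft * 12))

for res in (1080, 2160):
    angle = pixel_angle_deg(42, res, 8)
    verdict = "resolvable" if angle >= ACUITY_DEG else "below the acuity limit"
    print(f'42" {res}p at 8 ft: {angle:.4f} deg per pixel ({verdict})')
```

At 8 feet, both come out below the 0.0167-degree threshold, matching the comment's conclusion that the extra pixels go unseen at that distance.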
@Juan Perez, his comment made sense and yours didn't. There's a limit to the resolution of the eye. You can drive a vehicle without using your legs, but you can't view a screen without using your eyes. This is why the limitations of the eyes are relevant and the limitations of the legs are irrelevant.
@@seriouscat2231 There isn't a hard limit on the resolution of the human eye, because the eye does not see in terms of resolution. It's the same with frames per second: the eye does not see in frames.
First off, I love how he says there's a scientific explanation and then never gives the so-called scientific reason. Second, I don't know if he is just trolling or really can't see the difference between 1080p and 4K. And third, knowing even a little bit of color theory and how it affects cinema, it's hilarious that he says the thing companies do to "make it look better" is just enhancing colors... I don't know about 4K vs 8K, maybe there you can't see a difference, but 4K vs 1080?!
How close are you to the screen? To see 4K on a 47 inch TV properly, you should be back ~67 inches; that's a 0.01 inch eye resolution. If you sit closer, say 3 feet, you can see a difference, but you cannot easily see the whole screen. Now, if your TV cannot process a 1080 signal properly, that's another problem.
I have a 4K 32" monitor and a 32" 1080p TV, and I don't see a difference between them when I use the monitor as a TV (sitting at a distance). But the difference is very clear when I'm near it. It all depends on size and distance. For me it is very clear that, used as a TV, there is absolutely no difference between 4K and 1080p on a 32" screen. Of course, the images themselves are different because the monitor is a 144Hz IPS and the TV is a VA panel (I believe it is 75Hz but I'm not sure), but those are contrast, color and refresh rate differences, not resolution.
Yup, 3/4 of the video is just presenting the resolutions and 1/4 is the tiny explanation, which is the most important part and he’s not elaborating on it. More importantly, the distance from the tv is arguably the most important aspect. I guess this video is for the complete beginners but even then, some thorough science could help them understand better.
Makes you wonder how many people don't know the difference between a row and a column. But I suspect that's why his explanation is so dumbed down, because there are people that don't.
Did I just hear that 1080p will look good at 110 inches? What, 20 years ago people said there's no reason to go to 720p. It doesn't take a genius to work out how pixel density and resolution work.
You could also have added ppi to indicate clarity or detail. E.g. a 5.5 inch phone with a full HD screen has a ppi of 400+, but a 40 inch full HD screen has a much lower ppi. I hope you get the point.
4K is good, and there are quite noticeable differences over 1080p on a laptop, but I'd like to have 8K even if the differences are minimal, because going one generation beyond our eyes' ability means we're at the pointless endgame of the Megapixel Race. :)
Anything less than an 80 inch screen is pointless for 8K. If you don't have the room or the desire for a massive TV, 4K is more than sufficient, and overkill for many. HDR is the real game changer, not higher resolution; our eyes cannot resolve increased resolution past a certain point.
@@xpodx No, real life isn't "full HD"; full HD is just what you see. 4K and above are higher than what you can resolve in real life. The human eye has limits, so beyond 8K you're dropping money in the garbage, and anyway you need a very big screen to see the full impact of high resolution. Most people don't need 90 inches and above, so no matter how we look at it, over 8K is useless...
@@xpodx Science isn't wrong; stop telling yourself stories just because you're a high-resolution fan or a seller of these TVs. Science has already answered everything about this; you can find optics professors saying that over 8K is bullshit and pointless, because the human eye has limits. End of story...
Considering 4k is already affordable now, and 1080 is great for medium to small displays, you're 100% right. How bout some dynamic range? After all that's what'll really give the image life, our pixel counts are plenty enough.
@@AnywhereMiami No, it's NOT "a fact". The Navy has published studies of pilots being able to see images flashed for a single frame at way higher rates! A simple Google search for the paper should do it. Truth will out.
I have a 144Hz monitor. I went into a game where I can control the FPS cap, set the cap to 144fps and it looked normal, then went to 60fps and it felt sluggish and laggy. There is definitely a difference.
As many point out without making the distinction: FOOTAGE in 8K is quite amazing; DISPLAYS in 8K are useless. Honestly, shooting in 8K to 16K and above gives tremendous advantages in the editing room when you need to do serious cropping and zooming (computer-intensive, granted). Just distinguish between footage and display.
When he said what the "p" stands for, I was like "yeah, yeah, I know", but when he said "progressive" I was shocked. What, not pixel but progressive?... Learned something new.
Yea, and in 1080i (which most cable and satellite shows use) the i means interlaced. Some people say 720p is better than 1080i. The reason TV shows in 1080i and not 1080p is the bandwidth required; 1080p is what's on Blu-ray.
@@whydef269 No, 1080i came from the interlaced SD world and had to do with motion flow and removing jitter. 24p physical film was double-shuttered (i.e. each frame would flash twice on screen before advancing to the next), to help remove flicker. The reason 1080i was a thing for broadcast was mostly sports: they usually broadcast in 720p or 1080i, and interlacing mostly had to do with half of 60Hz electricity, i.e. 29.97fps. You youngsters have a lot to learn.
@@beachcomberfilms8615 Everything I've read about why cable and satellite companies don't stream 1080p has said it involves too much bandwidth and costs much more. Yes, there are movies and PPVs that show in 1080p, but that's different, since you have to purchase them to watch. I haven't seen what you said about how it originated, so that might be true, but I was speaking about why they still haven't moved on from 1080i. Most cable companies, I mean.
@@peterpan408 RIP. I hate interlacing; it was a necessary evil for the old CRT TV days, thank God it's obsolete. Now if we could only get rid of the need for uneven frame rates in the U.S. (29.97, 23.976, etc.) and drop-frame timecode, editing would be a much cleaner task.
Sure... and it's true: marketing garbage. Just like those screen sizes, especially with computer monitors; they shave off half an inch or more and still upsize the number in the ad. Don't get me started on aspect ratio... that 16x9 over 16x10 is a rip-off, pawning off less screen real estate. Marketing swine.
I honestly thought I was going to learn something today; I learned nothing. He just explained extremely basic information. He never even got to the point of why an 8K TV would be pointless. As pixel count increases, so does your PPI, which allows TV manufacturers to produce much larger TVs at minimal fidelity cost.
For real, he took 5 minutes to explain 3rd grade multiplication, and he made a point of saying the "p" in 1080p doesn't stand for pixels, but didn't bring up 1080i or what the "p" actually means; then he goes ahead and shows that it's the number of pixels along one side of the screen, which makes it look like "p" means pixel.
@@matthewdominic4336 Basically he meant that 8K for consumer TVs is the same as 1080p, but depending on the size, like in cinemas, it would be more useful.
@@DISDATRONA If only I could agree with that; it would make buying a TV so much simpler and cheaper. The problem is that I do see a stark difference even between 1080p and 4K, and minor differences between 4K and 8K.
Agreed, this video is garbage. Unless you have a 20" TV or are sitting on the other side of the room, there's going to be a big difference between 1080 and 4K. What is this guy even talking about? Even 8K is highly noticeable if you have a large TV.
1080p is better if it gets 2x the framerate. 30fps is a bad joke at any resolution; even 60 is low, but a bigger improvement than 4K30 vs FHD30. We should aim for 120fps minimum, but Samsung and LG want it differently, so we are stuck with high resolutions that get blurred down to low quality through motion blur.
@@pflaffik I agree on that for gaming. But recently I've been watching TV Shows and Movies on high refresh rates and it kind of sucks. It may be because we're used to the "cinematographic" 24fps, but I don't really like high refresh rates on TV content. They all look like soap operas.
My laptop has a 4k 15" 16:10 screen. I never watched videos on my old laptop because it just looked terrible with fullHD. I think 4k is good for 13" and above but is absolutely necessary for 15".
Totally different things. We had to increase the size and speed of our computer memory to match the development in other parts, like the CPU, the GPU and primary storage. But I don't think our eyes are going to evolve to fully appreciate 16K or above, or even 8K, in our lifetime.
@@fahis500 You're damn right they're not going to evolve 😂 Evolution takes place over thousands of generations; you can't evolve once you're alive. Only offspring have the chance to evolve, and there's no survival need for our eyes to see in any more detail. Your eyesight might change over time, but only for the worse if you're staring at a screen all day.
I love how every time someone releases a video about basic facts, it's bombarded by 400+ OPINIONS. Guys, he didn't even mention the impracticality of 8K streaming, where 6GB = 1 minute of video. I have no problem purchasing something I have a need for; I simply don't need anything beyond 1080 at the time of this comment, as I don't have an overwhelming desire to turn an entire wall into a screen. If that's your thing, though, good for you.
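Taking the comment's 6 GB-per-minute figure at face value (it's the commenter's number, not an official spec), the implied bitrate works out as follows:

```python
gb_per_min = 6                        # the figure quoted in the comment above
mbps = gb_per_min * 8 * 1000 / 60     # GB/min -> megabits per second
print(f"implied sustained bitrate: ~{mbps:.0f} Mbps")  # ~800 Mbps
```

That is far above typical home broadband, which is the impracticality being pointed at.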
MrBaxtrax I’ve got a 75” 4K. Looks good. I can’t see anything beyond 16K, because no one would have room for a TV capable of illustrating the difference between 16K and 32K at the necessary size.
@@shadowhunter0 Wish they made 1080p HDR screens. I have no use for the 4k resolution since I have nothing that plays at 4k to even utilize all those pixels.
@Jalau 16K+ is definitely useful for anything above 40", just not at this moment. By the time 16K is mainstream, people will be using screens as replacement for windows with their own artwork/pictures being displayed. At that point, much like artwork, people will want to be able to stand close and admire the increase in picture quality. The way we use screens today is not the same as in the future.
Sorry, but you're wrong. The human eye can easily distinguish between 1080p and 4K, or even 8K. At its peak, the human eye can see about 576 million pixels (better than 8K!). Maybe it's because you wear glasses and are visually challenged that you can't see the difference that others with healthy eyesight can. I had the privilege of seeing a display at a technology expo, and I can tell you the difference even between 8K and 16K is phenomenal!
I don't think so, at least not like that. Most likely it was because TVs between 40-50" were the majority, and everything much bigger, like 70-80", was pretty expensive and therefore a niche. At 40-50" you usually don't really benefit from the resolution increase, as 1080p is already very sharp at that size at common viewing distances. But as UHD ("4K") TVs came with other technological advancements and also got more affordable, of course they became the mainstream, with 1080p TVs now mostly dead.

Now the average TV size is probably higher, but is it big enough to warrant 8K (is there a proper name?) TVs? The size would need to double for it to be worth it, and we're not at 80-100" being the norm; it's probably more like 50-60". So for most people 8K really is useless, and it also doesn't seem to come with other advancements AFAIK.

Is 8K resolution itself useless? No. For production you want higher resolutions than the consumer receives, to have headroom for processing the source without degrading the quality of the final output, like high bit depth (24-bit, 32-bit) for audio/music. But that's a different topic.
Appreciate the explanation and breakdown of the numbering system regarding pixel count, but you can't say there isn't a noticeable difference between 1080 and 4K. Perhaps not for casual television, but as far as graphics are concerned while gaming, the difference is very obvious.
I think cost is a driving factor. A few years ago people told me not to record video in 4K, but I could tell the difference, and I'm glad I did, so I'm not re-doing all those videos as monitors get larger and 1080p looks worse. Higher Ks will eventually let us move our monitors to a full wall, and we get more desk space. Office meetings will do their planning on walls. Higher K brings more options on larger screens, but price matters.
Why is it that I always hear people say “any more than 4k is useless” and “anything more than 60fps is useless.” It’s complete BS. You can tell the difference between both
Because they are idiots. Here is the math: a 100 inch 8K panel is the same as four 50 inch 4K screens; both setups provide about 33 million pixels. The baseline is 50 inches at 4K, which is exactly the same pixel pitch as 100 inches at 8K, which is also exactly the same as 25 inches at 1920x1080. This pixel size and pixel pitch/gap has been the baseline for many years now. If anyone wants super high resolution, then look at your 5 inch phone, which might be close to 16K density (but it's only a 5 inch panel); you would need a microscope to see the pixel gap.
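The pitch equivalence claimed here does check out; a quick sketch:

```python
import math

def pixel_pitch_in(diagonal_in, h_px, v_px):
    """Pixel pitch in inches: diagonal length over diagonal pixel count."""
    return diagonal_in / math.hypot(h_px, v_px)

for diag, h, v in [(25, 1920, 1080), (50, 3840, 2160), (100, 7680, 4320)]:
    print(f'{diag}" {h}x{v}: {pixel_pitch_in(diag, h, v):.4f} in per pixel')
# All three print ~0.0113": same pitch, so same sharpness at the same distance.
```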
@@VYDEOS2 120fps, yeah. However, 8K does have a purpose. A 100 inch 4K screen is going to have really large pixels. To give a 100 inch screen the same pixel size as a 50 inch 4K screen requires 100 inches at 8K. Larger screen sizes drive higher resolution.
@240mains That's what pixel density is. 4K on a 100 inch screen will look bad because there aren't many pixels per inch, since the pixels are bigger. However, 4K on a 20 inch screen will sure as heck look better than 8K on 100 inches. They should start measuring screens by pixels per inch rather than total pixels. Apple Watch screens look nice even though they are only ~360p, and yet your 80 inch 4K TV doesn't look so good. So basically, anything higher than 360p is useless on an Apple Watch screen, but not on a TV. Most people don't own a TV big enough to see a clear difference between 4K and 8K; until people regularly own hundred inch TVs, 4K and 8K are virtually the same. And then there's distance: if you view close up, you will see the pixels and the image will look bad, but from further away the difference between 4K and 8K is even smaller.
I remember hearing this when 1080 went to 4K. With two TVs side by side, you can 100% see a difference. They also say 30fps is all the human eye can see, but any gamer will sit in front of a screen and tell you the difference between 60 and 140fps without even knowing the actual fps.
I find it strange that between 1080p and 4K they switched which side of the screen they measure. 1080p is 1080 pixels on the vertical side and 1920 on the horizontal. 4K is 2160 pixels on the vertical side and 4096 on the horizontal. Yet 4K gets its name from the pixels along the horizontal side, whereas 1080p gets its name from the pixels along the vertical side, which is considerably less. Using 1080p's logic, 4K is actually 2160p, and using 4K's logic, 1080p is roughly 2K. It's almost as if someone wanted 4K to sound like a bigger jump from 1080p, so instead of using the same scale and showing a twofold improvement, they switched and made it seem fourfold. Unless I'm missing something, idk.
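For reference, the common names and pixel counts side by side; the comment's reading is right that the marketing names switched from counting rows to counting (rounded) columns:

```python
resolutions = {            # name: (horizontal px, vertical px)
    "720p (HD)":        (1280, 720),
    "1080p (Full HD)":  (1920, 1080),
    "2K (DCI)":         (2048, 1080),
    "4K (UHD)":         (3840, 2160),
    "4K (DCI)":         (4096, 2160),
    "8K (UHD)":         (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")
```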
They do exist. Go to virtually any sports arena/field and look at the "jumbotrons". At that size, 8K COULD be discernable from 4K. On a normal home entertainment TV, at the average size, at the optimal viewing distance, the human eye will not be able to tell the difference. But he means home displays most likely. Which at 100" is too costly for most. That's either a theater room with a projector or a very large empty wall with a room large enough to accommodate. Again, most people are limited by either. So, for the most of us, 8K isn't realistic economically, and even if it were, it's still not realistic practically.
A 100 inch screen will still not need 8K or 16K, I believe, because the optimal viewing distance will be greater, and thus the apparent size of its pixels will be the same as those of a 4K TV 2 or 3 meters away.
If you wanted to make the case that the difference between 4K and 8K may not be very noticeable by the human eye, I think most of us would agree with you on that….but to say that the difference between 1080p and 4K isn’t noticeable, or that it is barely noticeable……I’d say that’s a bit much. Also, 8K and 16K aren’t useless. 8K and 16K makes TV’s over 100 inches possible, without the need to switch over to a projector.
So for people saying I'm hating on 8K TVs because I can't afford one, you seriously need to watch this: ruclips.net/video/FVqZA9iVTJQ/видео.html It's an amazing example of how quality of TV technology is far more important than the quantity of pixels.
I have a 75" 4K Q9FN TV, and except for the HDR effect I don't notice any difference between the 4K and 1080p Blu-ray versions (yes, I do see it when I go very close, but not where I sit). Another thing: if I compare my 55" C8 OLED to my 50" KRP-500M plasma with 4K vs 1080p Blu-ray, the plasma TV will look sharper in motion, and I got the same results next to the Q9FN. So my conclusion is that if you have a good 1080p source, it's enough for home TVs. We also did the same test at my brother's place with a projector; there we do notice that 1080p is not enough vs the 4K version, but that's at 100". So I think, as you say, it's the cinemas that will benefit from 8/16K. 4K on home projectors and, imo, 1080p on TVs 75" and smaller. Even a good 720p source looks great on small 50" TVs. And I'm a nerd about picture and sound quality, but I also refuse to be fooled by the TV industry trying to force useless, overly expensive tech on consumers lol. Great video btw
I'm pretty sure I saw a video made by you years ago about 4K. I may be wrong, but I swear I remember something like that.
It is not about the resolution "as is" but the resolution relative to the size of the screen.
It's like in print or in mobile phones: "effective resolution", which is measured in ppi (pixels per inch) or dpi (dots per inch). I work in print, and for the printing industry, 300dpi is the standard for press. If we print big banners we can lower that, depending on the distance the banner is viewed from. A banner on a big building is printed at 30dpi.
So, 8K TV... depending on the screen size and the room. You might need it, you might see very clearly the difference between 1080p and 4 or 8K. :)
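The dpi-to-print-size arithmetic in this comment maps directly onto pixels; a small sketch using a 4K frame as an illustrative image (the 300 and 30 dpi figures are the ones quoted above):

```python
def print_size_in(px_wide, px_high, dpi):
    """Largest print size, in inches, for a pixel image at a target dpi."""
    return px_wide / dpi, px_high / dpi

w, h = print_size_in(3840, 2160, 300)  # press-quality print of a 4K frame
print(f"300 dpi press: {w:.1f} x {h:.1f} inches")          # ~12.8 x 7.2 in
w, h = print_size_in(3840, 2160, 30)   # building banner seen from far away
print(f"30 dpi banner: {w / 12:.1f} x {h / 12:.1f} feet")  # ~10.7 x 6 ft
```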
@Jamal Did you read your own comments? How the F can OLED "only do 4K"? What do organic diodes have to do with pixel count?
Most folks don’t even need 4k. I don't even use 1080p most of the time. I pipe 720p to my Sony WEGA and it looks gorgeous from our sectional, about 10 feet away.
8K is not useless. It makes 4K so much cheaper.
kuei12 🤣🤣
Because! Truth! .....and Marketing. 😆
Haa ha👉
Damn right! Philips now doing cheap 4K TVs that do HDR10+ AND Dolby Vision!
Yeah, it’s big brain time
8K TVs are useless, sure, but shooting video at 8K or 16K allows you to crop in and reframe shots without losing any visible quality.
Exactly.. I was about to say the same thing.. thank you 👍
yeah... not his point though...
@PeckyThePigeon incorrect, you did not provide any explanation
@PeckyThePigeon It does work that way, though. If you have an 8K video, you can display sections of it on a 4K display, down to 1/4 of the original frame, without it getting pixelated and without any messy interpolation.
@PeckyThePigeon It absolutely works that way. If you shoot 8K you can crop in pretty far on a scene and still come out with a 4K DI, which is what newer blockbusters are finished at.
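The crop headroom described in this thread is simple ratio math; a sketch using the standard UHD frame sizes:

```python
SRC_8K = (7680, 4320)  # capture resolution
DST_4K = (3840, 2160)  # delivery / DI resolution

# Smallest usable crop that still gives one source pixel per delivery
# pixel, i.e. reframing with no upscaling:
crop_w = DST_4K[0] / SRC_8K[0]
crop_h = DST_4K[1] / SRC_8K[1]
print(f"can crop to {crop_w:.0%} of width x {crop_h:.0%} of height "
      f"({crop_w * crop_h:.0%} of the frame area)")  # 50% x 50% = 25%
```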
My guide dog is perfectly happy with my HD TV.
severely underrated comment
@@nullstr-k6v Who said that?
@@moderman512 i bet your dog would taste well in my bat soup
@@channelname4331 why, why would you say that.
@@coreybircher8413 Dogs taste good apparently
“Cinema”? Haven’t heard that name in a long time...
Because it's pretty much only in the US that we say "let's go to the movies/movie theater". Everywhere else in the world they call it the cinema. (The building, anyway.)
@@As_A________Commenter they're talking about how quarantine made theaters close
@@As_A________Commenter many other places call it cinema
@@gunter6377 which is exactly what I said
wait until you hear "layar tancap"
I agree that 4k vs. 8k the difference in sharpness is hard to discern, but I wouldn’t say the same thing for 1080p vs. 4k.
Indeed, but only in cases like watching on massive TVs. We won't have problems on monitors under 40".
@Jalau Not quite a standard when I'm living in a small flat LUL. Under 40" is all fine.
@@jasonluk816 No! I can notice the difference between them even on a 27 inch monitor! There is a noticeable difference: 1080p on 27 inches will show visible pixels, which won't happen with 4K. Distance is an important factor.
itachi uchiha That's why there's a recommended viewing distance for the monitor from the factory. Besides, if you are noticing your monitor's pixels, most of the reason comes down to ppi (pixels per inch). If the ppi is over 200, you are good to go.
@@jasonluk816 I have an iPad with 264ppi and it is garbage compared to my 532ppi smartphone! There is a massive difference anyone can notice.
These things matter:
1. TV screen size
2. TV resolution
3. Viewing distance
4. The camera that shoots it.
Pixel density is important, yes
5. Whether you're wearing glasses (if short-sighted)
100% correct! If your eyes don't change, and the distance from the screen doesn't change, then the "K" going up gives you the most benefit if the screen size is also increasing. If you can see dots with whatever you have, getting the same _size_ screen at a higher "K" will make it seem crisper, more detailed, and smoother. Make it bigger at the same time and you'll have about the same experience as before, only bigger, probably meaning your eyes will be darting around the screen and missing detail elsewhere. This is the _exact_ same experience as sitting closer to a smaller screen without changing the "K" number of pixels. All these numbers don't matter. Just distance, size and resolution, or "K".
Eyesight quality
RUclips running at 360p: *imma pretend I didn’t see that*
@@sdrawkcabmiay Man I feel like I'm watching 4K whenever I switch from 144p to 360p when my internets a little better lmao
@@faimashuni9567 Meanwhile in Finland I refuse to watch videos under 1080p
@@tazka69 Yeah for some reason the internet here is SO expensive
@@tazka69 my heart is aching just reading those words
I watch at 4k lol
8k is important for vr when you’re half an inch away from the screen
Not really, the screens in VR are so small, the density is way higher. With dual 4K small screens in VR I doubt you can see any pixel
@GermanAnimeFans once we have a 120Hz 8K VR headset, it will be legit just like real life, if not even better lmao
@@CapFausto pixel density for things like VR doesn't matter per inch, but rather per degree of vision. The wider FOV you want, the more pixels you want; size and distance don't matter, just their combination. For full "can't tell the difference" you'll want at least 8K per eye with 160 FOV (in the long run OFC) but even with current FOV 4K isn't perfect immersion
@GermanAnimeFans what you're not thinking of is the on-board tech, specifically upscaling AIs. Using those we'll be able to turn 4K images into 8K or higher no problem - sure, no additional information will actually be shown, but it'll be great for immersion!
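Rough numbers behind the "per degree" point above: in VR what matters is pixels per degree of field of view, with ~60 ppd commonly cited as the 20/20 (1 arcminute) threshold. The panel/FOV pairings below are illustrative assumptions, not specs of any particular headset:

```python
# Angular pixel density: horizontal pixels spread across the horizontal FOV.
# (Simplified: real headset lenses distort, so ppd varies across the view.)
def pixels_per_degree(horizontal_px, fov_deg):
    return horizontal_px / fov_deg

print(pixels_per_degree(1920, 100))  # 19.2 ppd -- early consumer headsets
print(pixels_per_degree(3840, 110))  # ~34.9 ppd -- a "4K per eye" class panel
print(pixels_per_degree(7680, 160))  # 48.0 ppd -- the 8K/160-degree case above
```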
I've heard somewhere that the resolution of our own eyes is somewhere around 8K. If that's true, depending on distance from the screen, going beyond 8K would be pointless.
It would be good if we reach the limit, so we gamers stop focusing so much on graphics and shift focus to technologies which aren't mature enough yet, like AI, and devs and hardware manufacturers change their focus. Imagine a chip with built-in neural networks at a nano level, with trillions of nodes and connections. It would make youtube's AI trash in comparison.
I just waited 10 minutes for you to say "you can't tell the difference". That was the scientific reason you said you'd explain at the end of the video
But he's holding a pen so he must know what he's talking about
I mean if he has a 1440p android phone he could just switch the resolutions and you can clearly see a difference between those on a small screen lol
I mean I have a 55" 4k tv and realllllly can't enjoy 1080p yt videos bc they just don't have the resolution
@@prayforwilly RUclips also compresses its videos to hell. Even comparing source 4k material with 4k videos on youtube, there is a drastic difference in quality.
@@potatoe4221 I know but I meant the internal resolution.. so like you can see the difference in like the letters etc.
@@prayforwilly most people can't tell the difference between 1080 and above on smaller devices. I've seen people notice color contrast and frame lag more than resolution
The way you write "2" starting from the bottom is making me uncomfortable :-(
Plot twist: this whole video is recorded backwards, then inverted. So don't worry about the "2" ;)
The 8 is even worse 😩
Bro same. Wtf.
@@lekobiashvili945 unexpected tenet
@@fabianmichael9457 looooool he writes them like snowmen :')
Amazing how you managed to avoid the terms „pixel density“, „ppi“, or „dpi“ throughout this video!
@Joel Nelson 1080p and 4k will look like they have the same pixel density with the marker on the whiteboard
He just doesn't use the term but that's what he talks about towards the end with screen sizes.
Damn marketing terms!
dpi is for printing
@@wirotejitrungsri559 DPI and PPI are used interchangeably even though they're technically different, the point is still getting across.
At 47", you can definitely tell the difference in quality between 1080p and 4k resolution. But it also depends on the distance you are away from the TV. I think the minimum resolution for various TV sizes are as follow:
Up to 32": At least 720p
Up to 55": At least 1080p
55" and above: At least 4k
The most logical comment here
The PPI vs. viewing distance is what matters.
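If you wanted to encode that rule of thumb (the commenter's opinion, not any official standard), it's just a size lookup:

```python
# Minimum resolution per TV size, per the comment above (opinion, not a standard).
def min_resolution(tv_inches):
    if tv_inches <= 32:
        return "720p"
    if tv_inches < 55:
        return "1080p"
    return "4K"

print(min_resolution(47))  # 1080p
print(min_resolution(65))  # 4K
```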
Take a shot every time he says “marketing garbage”
32 likes r the people that survived
You trying to kill me bro?😂
I ran out of liquor and now I have to go buy more? This comment is marketing garbage.
4k ultra super amoled HDR+ platinum titanium uranium super wide. Man, the jargon is half the reason I didn't buy a 4k tv
@blue yellow stfu
The larger the screen size, the more resolution you require to stay within a certain ppi range.
Yep. I'm at 27", 215 ppi
Yeah, but anything is Retina if you are far enough away. 8k on TVs is a waste, as chances are you will sit further back if you get a TV over a specific size. Say you sit 2 meters away, like I do; well, 4k is retina until you get to about a 140 inch TV. I had a 32 inch 1080p and got a 55 inch 4k a few days ago, and sitting 2 meters away I can't see a difference in quality, but it's big enough to see now, and I would have noticed a difference if I'd bought a 55 inch 1080p instead of this 4k one. As size gets bigger, resolution should too, but 4k is already so dense it makes 8k almost completely pointless, and anything above will be 100% pointless.
Thank you
@@MrZodiac011 I started using 4K TVs as monitors ~7 years ago, because monitors were either not available or just insanely expensive at that resolution. Saying "8K doesn't make sense" is just denying every use case not covered by "sitting on a couch, watching TV".
@ I mean, you can use a TV as a monitor and vice versa, but that's not what this is about. By talking specifically about 8k TVs and not 8k displays, it's automatically implied that we're talking about television (and console gaming). High resolution monitors and non-consumer displays/TVs definitely have their place.
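For what it's worth, the common 1-arcminute-per-pixel "retina" rule of thumb puts the 2-meter crossover for 4K at roughly 100 inches rather than 140 (the 140" figure above implies a somewhat laxer acuity criterion); a sketch of that calculation:

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_retina_size_4k(distance_in):
    """Largest 16:9 4K diagonal whose pixels stay under 1 arcmin at this distance."""
    pixel_pitch = distance_in * math.tan(ARCMIN)  # smallest resolvable pixel pitch
    diag_px = math.hypot(3840, 2160)              # ~4406 pixels along the diagonal
    return diag_px * pixel_pitch                  # diagonal, in inches

print(round(max_retina_size_4k(78.7)))  # ~101" at 2 m by this criterion
```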
RUclipsrs in 2030: why beyond 16k is useless
I'm waiting for those videos while I watch them on my 4k screen.
Damn when you said 2030 i automatically thought 15 years from now and have to wait for a second to realize that it's only a decade away. Man time moves too fast.
RUclipsrs in 2040: why beyond 32k is useless
@lHarryl 2060: why TVs are useless
@Logic_Encrypted 2077: Why?!
16K exists
My internet: we dont do that here.
Heck my internet can’t even handle 720p60
I saw that 4k was an option to view the video. My poor 2014 Chromebook.
@@DrYesorno haha.
Same😌
Dr. Yesorno mine literally buffers at 360p😃
@@DrYesorno sometimes ram is the problem
Try to load an 8k video on youtube
Even when loaded it will lag
Crazy how this video is trending after being released back in Feb 2019.
Mysteries of the RUclips algorithm
It'll soon go down as just another cringey misleading low-iq tech video, if it hasn't already...
I think YT is trying to recommend us New Year's Resolutions
I think it may be trending because I saw a Google article about the best items to buy after the holidays; it was TVs
2021 gang who up?
I remember people back in the days saying that 1080p was useless.
No, you can physically see the difference between 720 and 1080
@@joegunnigan7519 yes, it very much is.
@@joegunnigan7519 you can physically see the difference between 4k and 1080p.....
@@yanceyboyz Yeah, I cannot comment on that as I do not own a 4k screen. I wasn't denying that 4k is useful, I was just saying that 1080p is definitely an upgrade from 720p
@@joegunnigan7519 you can indeed see a large difference.
"1080p or 4k?"
Me as an intellectual: 1440p
perfect resolution for a gaming monitor. playing on 4k is burning money.
@@beepbleepboop I actually enjoy it, I need a rather big screen, and if it wasn't 4k it would not look as nice. For competitive play though, yeah it's just not necessary - but nice to have
For monitors the best resolution is 1440p because 4k is too much and 1080p is not enough
@@jesuscolon1373 What do you mean? 1080p is still used by the majority of people, and I use a 1080p 24 inch Samsung monitor that is still amazing
Haha
At some point the human eye needs to be upgraded to keep up with all of this.
Mk2 eyeball
pros:
+ able to discern more colors
+ more strength added to eye muscles reducing the chances of eye strain
+ ability to choose the eye color
+ able to zoom in and out
+ adjustable sensitivity to light giving a better vision at low level lights
+ added layer of water resistant materials
+ ability to discern smaller details not present on Mk1 eyeball
+ ability to add notes from brain enabling better focus on the brain
+ wider field of view allowing more vision with less head rotation
+ able to see and feel moving particles allowing brain to see through echolocation and wind
catches:
- very expensive to manufacture and put on
- requires more energy to operate requiring more time to sleep and eat
- chances of failure leading to blindness if not applied properly
- requires high precision to apply and replace as materials used currently are very delicate
- requires more maintenance
- quickly heats up as cooling system is still under development and testing
manufacturing: $1,000,000,000,000,000,000,000 plus tax and shipping (application and maintenance not included)
Wym, our eyes are complex
@@nitsu2947 Cons: vulnerable to optical reboot quickhacks
My sunglasses do that
This whole video could be summed up in two words - "pixel density".
This guy is a moron
Like the Ramayana: "Ravana stole Rama's wife and Rama attacked and killed Ravana"
@@GamingRealRacing3 how?
The video literally talks about how it's not pixel density, but pixel count
Visual Acuity
I disagree that 1080 looks the same as 4k. I get hypnotized when I stare at a 4k tv in the store; it almost looks like you're looking out of a window.
8k video is much better looking than 4k. Just take a peek.
I have a 60 inch Samsung 4k tv. When I had DirecTV the 4k looked so much better than the HD on the same tv.
@@borisfrog5282 I was watching a Samsung QLED 8K... it was like it was happening in front of my eyes
@@thabanglehetla6073 My Samsung QLED 4k is the same 🤔 like UNBELIEVABLY AMAZING
Did you not listen to the video? moron 😂
I was afraid he wouldn't mention distance; that thing actually matters the most
Yep. Bigger TV means you need a better resolution to be as clear as a smaller TV. Hard to believe my phone has as many pixels as my TV lol, but then again, my phone was much more expensive than my TV
@@josiahm6690 which phone did you own ?
@@christianc.1632 note 20 ultra
Yeah distance and screen size are by far the most important factors. I'd say for 99% of consumer TVs there will be very little point in owning anything beyond 8K at most. 8K and over will only really be good for commercial applications like movies or large public screens.
But that is exciting, because once 8K becomes the standard, there will be more motivation to focus on improving the panels and the form factors of the TVs rather than increasing pixel count.
I'm sitting here watching football hungover on my couch 9 feet away from my 70 inch 1080p Vizio tv from 2014 and I can't see any pixels and it's very clear. In fact the reason I haven't upgraded to a 4k tv is because the only thing that looks better is the oled screens and I ain't paying 3k for 4k 😂
hold on a second, this maniac starts writing his "2"s from the bottom?!!?
IN WHICH SCHOOL/KINDERGARTEN DID THAT BRUH GROW UP?!
@g@m3 Or how i draw 6
or б (russian b)
Why is nobody else talking about this?!?
*Person* does something different to another person
*Other person* Gasp! “You Maniac”
I know right. It was bothering me too
8k will make sense at a 75" though.
..
That's a little too big for a small room. TVs are becoming too big to keep.
Uh...no. Still overkill for 75". IMO, unless you have a living room big enough for a Jumbotron, 4K (ahem, UHD) is well good enough, even for a 120" (10-foot) screen.
If you're using it as a monitor, I'm inclined to agree.
@@ebinrock you clearly don't have or haven't experienced a big 4K TV.
"Marketing garbage" is one of my favorite terms as an engineer. So glad it's widely used.
Yeah, but it’s the marketing garbage that sells stuff.
How to stretch the explanation of PPI into almost 10 minutes.
This.
Exactly! I got to the end and thought is he really just talking about pixel density.... lol. Him saying 4K looks better because of quote "colour".... bahahaha
He is a Tech youtuber as he said...
Except he didn’t actually talk about PPI, or more importantly the maximum pixel density discernible by the eye at a given distance.
His explanation was intended for consumers who have no idea what HD, FHD or 4K means. It wasn't intended for you lol. He did a great job explaining it and the number of views on his video speaks to the quality of his explanation.
I'm so excited for 4K to become the standard in gaming, since beyond that point you really won't be able to see the pixels.
Only on a 42 inch tv. From 10 feet away on a 50 inch the average person can already see pixels on a 4k tv; 46 inches is where they start to become visible. Anything over 95 inches from 10 feet away will also be visible even at 8k; once you go above 95 inches you need 16k for perfect fluidity. Or if you sit closer to the screen, say 6 feet, 30 inches is the cutoff for needing 8k and 66 inches is the cutoff for 16k
If you game at a desk with your computer monitor 3 feet away, even on a 30 inch monitor you will be able to tell the difference between 8k and 16k. 4k is only a perfect density for 16 inch screens or smaller at that distance, e.g. a 15 inch laptop screen, where 4k is ideal
I can play sea of thieves in 4k 60fps on my 27" monitor and it looks AMAZING. However, most games won't hit 60fps at 4k with my 2070 super
I'm still gaming at 1080p 75fps
Sorry to break it to you but the human eye can see 576 million pixels. 32k resolution is a little less than that (about 531 million pixels). Beyond 32k is pointless.
I can’t even tell the difference between my 65 inch 4K OLED and my original game boy screen. It’s all marketing garbage!
that nearly killed me!
then Your EYES are the real Garbage!!
😂🤣
Should've gone to specsavers
😂
Beyond 4k is useless for how most consumers use standard TVs and monitors in their homes.
It is not useless as a video format though. It's also a future proofing measure as there are many kinds of displays and such images will likely be able to take advantage of that level of pixel density in the future.
It allows for zooming in with increased clarity, which could be useful for future security monitoring technology.
On top of that, it's useful for very large screens. As in theater sized, not home sized. Larger screens require greater pixel density for the same level of clarity as a smaller screen.
It also matters for HMD devices such as VR headsets. The screen is literally right in front of the user's face, and has lenses that alter and magnify the image so that it appears correctly to the viewer's eyes. A 4k screen still has the "screen door" effect where you can see individual pixels for devices like these. You need to get closer to 8k or beyond to eliminate it.
The closer you are to a screen, the more pixel density matters. This would include AR virtual screens that a user might commonly walk right through in the future. Think someone wearing AR glasses in the future, and walking down a sidewalk with virtual ads displayed outside of stores.
In addition to that, it will also be useful for virtual screens to have very high pixel density images. Things like VR and AR in the future will make use of higher pixel density in ways that users will likely be able to perceive. Not just because of the user's physical proximity to the screen, but also because of how virtual screens can be manipulated and used in virtual environments.
Think of it as the same reason why 4k textures in games can still matter in a game that only displays in 1080p. A player looking at a 1080p screen can perceive the difference because a 4k texture retains more detail as the player gets "closer" to it in the virtual space. The same thing applies to virtual screens.
Basically, beyond 4k is useless for standard consumer level home screens, but still matters as a future proofing technology that has some limited uses today, but will be increasingly important in future technology.
Thank you for your valuable information about this. You are absolutely right 👍
That’s largely a BS claim as there are no real consumer grade VR headsets running 4K screens, and the distance to the screen for 4K is literally right on the edge where the pixel detail is blurred together and indistinguishable. At most you would need 5K to make that disappear completely.
For monitors, it will make sense for content creators that want to be able to edit many of their images in what is close to a 1:1 ratio.
As for future proofing, do you honestly see a day when the average consumer has a TV larger than 160 inches? I do not and I don’t see a time when internet speeds to the home will average the speeds required to meet a quality stream in the next 5 or 6 years. So you’ll likely be buying a much cheaper 8K TV by the time everything else catches up, and even then you would have to be watching that TV at a distance of less than 5 feet to see a huge improvement on a 60” TV screen.
@@p3rrypm Incorrect; that's mostly a straw man, as it doesn't accurately reflect what I said.
First of all, I specifically said it's viable as a format, not as commercially viable consumer hardware right now. That means 8k image capturing and files rather than desktop monitors or televisions.
There are several consumer grade VR HMDs. That's literally what Oculus, WMR, and Vive are.
There are also higher end VR HMDs that aren't really for consumers, such as Pimax and a few others.
Some 4K HMDs having very little screen door is a result of a lack of sharpening. The image is literally softened and blurred slightly to reduce screen door.
The holy grail is to get to that point without needing to soften the image, so that fine detail will be retained. 5k is not enough to accomplish that.
I've kept up with modern VR development since the Oculus DK1, and have seen several professional statements to the effect that 8k is the goal for screens that can provide a decent fov without screen door.
8k VR screens exist now in the form of Pimax HMDs, and due to the wide fov of the display screen door is still visible even at that resolution.
Basically, the more you stretch the image, the higher the resolution needed to eliminate screen door.
It would also be useful for 180 or 360 degree video files for the same reason, because the image is stretched across a larger area, which lessens the image quality output significantly despite the resolution being high.
Again, 8k 360 video files exist, and even when viewed through a high end HMD, the image quality is not as good as even 1080p on a standard screen.
It's not just a limitation of the hardware, but of the format itself.
That's not even getting into things like AR or retinal displays, both of which are existing technologies today.
They aren't available on the consumer market and probably won't be for a while, but they do already exist as prototypes.
There's also the 4k texture factor I mentioned. 4k textures are useful even in a 3D environment that is only displaying at 1080p. This is because the user can get closer in the virtual space and the texture will retain a higher level of detail.
A good example of how this might be useful outside of a video game would be an 8k image or video file in a virtual screen in a 3D environment that the user can move around in. A virtual art gallery would be one example of a non-game use.
8k has a lot of uses for future technologies. It's a good standard of image quality that is relatively useless in standard consumer level displays because no one uses them with their nose touching the screen, but is still a viable format for recording and rendering because various future displays, both virtual and actual hardware, will likely be able to take advantage of the image resolution.
@@p3rrypm it is not about 8K TV. It is the 8K virtual screen that will be a quantum leap (high resolution virtual screen)
I completely agree with you. Especially on VR devices. in a couple dozen years, we're gonna need 64k on those bad boys.
Saying 4k and 1080p are the same at 47" is like saying 1440p and 720p are the same at 24".
yessssss
Depends on what distance you watch from. If it's a monitor on a desk you will see the difference, but on a TV from the sofa - not
Coming from the same people that were saying that 4k is useless on anything less than a 4000" TV just a few years ago.
@@youlouv1234 you have a 24 inch tv great
Well I don't see the difference from far away(about 6 metres) on my 52" display
Just going to put this out there:
4K is great for work and writing documents, because on 1080 the letters are quite difficult to see. You can still make them out if the text is large enough, but having 4K means you don't have to squint or give a second thought to what a letter is unless it's a ridiculously small font.
Can’t you zoom in?
@@BedNN Yeah, but that's kind of a hassle, and you can't fit as much on your screen.
I think you have vision problems if you can't use Full HD monitors.
@@attag_ua I mean, I can use them just fine. I just prefer having more space to zoom out.
"I am continuously underwhelmed by 8K." -Linus
What sources are 8k?
Graphics cards can now render in 8k+
@@MrEddieLomax RTX 3080 can do just fine, assuming you can afford 8K screen.
8K is good for cameras because you can edit it down and not lose quality.
@@taranaditya2767 if you can afford the RTX3080, you’ll be able to splurge on a 8k monitor 👀
"Here's why beyond 4k is basically useless."
People who want 8k for basic entertainment: >:(
It's honestly a placebo
@@Dizastermaster. No, it isn't, depending on which route you're going.
if you're gaming (in the future, because 8k gaming right now is not exactly the best for modern PC games requiring DLSS or rendering in a resolution of 5k or lower) it'll make anti aliasing pretty pointless, and games will look amazing.
(if you're wondering Yes you *PROBABLY* can get playable framerates on *OLDER* PC games with native 8k, however my focus is on modern games and I don't expect you to run Devil May Cry 5, CyberPunk2077 or Doom [for example] in Native 8k in 2021)
Yes, but the jump between 4k and 8k isn't worth the performance hit, and a better (refresh rate, pixel response time, colour accuracy) 4k monitor will always be a lot better value for your money than an 8k monitor
However, if you have a large TV, then 8k might give you a noticeable increase in quality
@@Ah-ec5ch uh... perhaps in 2021.
but what about the future? cause yeah I see no one picking 8k over 5k, 4k, 1440p or heck 1080p, there are people who still game in 480p
360 VR movies need a lot more than 4k to look reasonable. As time progresses, new technologies will require more data and more resolution.
We're talking about TVs, not VR
TV as we know it will eventually go away, as other tech becomes more convenient. 8k (TV or otherwise) isn't useless, it just is for most people right now. That's what this guy doesn't make clear imo
@@blahuhm6782 i doubt TV will disappear for quite a long time, even if classic TV Broadcasts will end, Smart TVs exist. And 8k is probably useless on a home TV, as 4k is on a smartphone (unless you use it for VR)
@@francescocastaldo7469 The resolution must evolve, both for filmmakers in their process and for end-user viewing in multiple ways. You can't detach the hardware capability from its output. 16K cam, 16K output. End of line.
@@AAvfx i highly doubt the end user needs a 16k video
Very intriguing. Been upscaling and testing out 4k/8k equipment recently. Thanks for the lesson!
🤣🤣🤣🤣🤣🤣🤣🤣
I remember people saying the same thing about 1080p... and 4k...
Definitely heard it for 120 hz more than I'd like to remember.
I still think 720p tvs look pretty damn good.
The human eye can only see 30 FPS
@@Un1234l I notice a difference between 60fps and 30
@@Un1234l I really hope that's a joke
There is a use for higher resolutions. It allows us to make bigger screens and lets the observer sit closer to those screens without being able to distinguish individual pixels. So, if you have two same-size screens, one 1080p and one 4k, there is a distance from the screens at which you can't tell the difference, and your 4k becomes useless. However, if you get closer than that point, the 1080p screen will start to lose clarity while the 4k still looks sharp. It's all about screen size and distance to the observer.
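That crossover distance is easy to estimate with the same 1-arcminute rule of thumb (a sketch, not gospel - real acuity varies from person to person):

```python
import math

ARCMIN = math.radians(1 / 60)

def cutoff_distance_in(diag_in, horiz_px, vert_px):
    """Distance (inches) beyond which single pixels subtend less than 1 arcminute."""
    pitch = diag_in / math.hypot(horiz_px, vert_px)  # pixel size in inches
    return pitch / math.tan(ARCMIN)

print(round(cutoff_distance_in(47, 1920, 1080)))  # ~73" (~6 ft) for a 47" 1080p set
print(round(cutoff_distance_in(47, 3840, 2160)))  # ~37" (~3 ft) for a 47" 4K set
```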
There is a huge difference in image quality between 1080p and 4K on my 55 inch TV. While "resolution" doesn't change, the bigger the screen the larger the pixels (at the same resolution), which affects the clarity and acuity of the image. Not sure what this guy is talking about.
i have the same thoughts. my 60 inch 4k Samsung looks great with good 4k content. The 4k feed from directv is way better than the hd feed on the same tv lol.
@@dawgpound4501 I am watching because of the title: Something about beyond 4k being pointless.
Netflix did some studies that showed HDR at 1080p was more satisfying to viewers than 4k, for typical 55 inch tvs. My 2c on this is that people now buy bigger tvs for smaller rooms and hence they can actually tell the difference between 4k and regular 1080p HD on 65 inch tvs and sometimes on 55 inch tvs. There's also the difference of image processing. The newer tvs make better choices about how to upscale a 720 or 1080 image to a 4k screen. I have 2 OLEDs from Sony and LG. The Sony and LG OLEDs have AI which uses a machine learning model to create the optimum upscaling. However no upscaling is perfect and if you look very carefully you could get artifacts in the image that indicate where maybe the choice wasn't ideal based on human vision. A 4k broadcast which was captured from 4k or higher material won't have upscaling artifacts. It should look more natural. Throw in HDR and the colour reproduction will be more realistic. It's a double win but not quite perfect. The latest OLEDs look so good because they match natural colour reproduction (sometimes pushed to be more vivid on the LG tbh) with 4k resolution and very high contrast possibilities. The black levels are much more realistic and the overall impression is more immersive and less like watching a screen. The machine learning even does object recognition and can make adjustments to neighbouring pixel contrast ratios to create more realistic depth of field. My missus thinks that with default settings the LG is more impressionistic whereas the Sony is more natural, but you can tweak either to make the image more or less vivid.
If you're watching TV that much then you're pretty much a loser.
There shouldn't be any difference regarding pixel count, as the ideal viewing distance would be around 30° from the top of your nose to the lateral borders of the display. What you see as a better picture is likely to be the higher bandwidth and wider color gamut 'included' with newer standards and containers, not resolution.
Me: "puts 4k to watch"
Laptop: you are overestimating me
Why did the algorithm bring me here over a year later, with no history of watching tech RUclipsrs, at 3am in the morning?
3am in the morning is redundant.
@@billweir1745 My brain cannot output proper English during those times
@@parker6918 haha fair enough
Bro same
It thought you needed a good laugh at this technically challenged moron
I kept waiting for the explanation and it was “that’s a lotta pixels.”
His "science" is never revealed because he was really just duped by his own self marketing tactic thinking he was actually smart
Thankfully I only glanced through the video, because that's what I thought was coming.
Basically "Why it's useless" is: because opinion.
I will see you at 4K vs 8K in a couple of years.
Thanks! This makes me feel less bad about myself not being able to afford an 8k TV, for now.
Me with 720p 👁👄👁
bruhhh WTF, you can have a 4k tv and be set for life lol
hahaha same bro. now im happy with my 4k tv
bruh, 8K is more or less luxury products that few people can afford, nothing to feel bad about.
bruh, I can see the pixels with my naked eye from a certain distance on my 1080p tv; as for a 4k one, it's waaay sharper. And don't start with the netflix and youtube content, those have bad bitrates; youtube fucks the bitrate of a video so much you can't even call it 4k anymore. You should try watching some raw footage or a 4k bluray, that's where you really see the difference between 1080 and 2160...
I guess his video will apply to most consumers. Almost no one uses bluray or has a 50TB media center for storing 80GB movies.
@MONOPLAY go ahead, teach me, i'm all ears master
I would be sincerely interested in learning about the issues you have with the RUclips and Netflix bitrates
4k is good at 60fps, FHD60 is still better than 4k30.
surely it matters when it comes to VR though; when you're that close up to a screen, high pixel density becomes crucial
I was going to say this. VR and Augmented reality systems. 8k is barely enough.
Yeah but that's different
Looks like VR keeps getting better until about 16k per eye.
VR is a gimmick lol
@@daniell5740 I'm going to bet you don't have one
“Beyond 4K is useless.” - B. Boolean
“No user will ever need more than 640 K of RAM.” - B. Gates
Apples and Oranges big time.
But he should have added to the title something like
Beyond 4K is useless on screen sizes smaller than 60"
There is no proof that Bill Gates said that. Not complaining, just saying that we, as spectators, don't really know if Gates actually said that or not, even though he himself denies that the quote is his.
what we need is all SD channels to become HD or 4k! :D
I don't even need to ask for streaming content higher than 4k
Until humans can upgrade their eyeballs that comparison is apples and oranges.
You're not really making a fair comparison, resolution and ram are very different things
It depends on the screen size and how far you view the screen from. People were using 40-60 inch TVs as computer monitors back when they were 1080p, at a much shorter distance than you are considering. The pixel density of 1080p used as a monitor does not look great for reading normal text documents, but it is still decent for video games and movies. For 40-60 inch monitors, 4K gets that resolution back to being somewhat better than a standard monitor, but it is not till 8K that they would get on par with cell phone pixel densities.
I remember when people used to say anything over 1080p was pointless, lol.
It was when 1080p first came out because of the tv sizes most consumers had at the time. It came out back when huge flat screen tv’s weren’t even around yet.
And 5 years later you're gonna bring this video up when 8k is standard
For me, anything beyond 1080 still is useless
As do I, and my Retina 5k is definitely better than 1080p
@@jhoughjr1 yeah. The difference is huge between 1080 and 4k. 1080p at 27" is a blurry mess in comparison
No difference between 1080p and 4K in terms of sharpness. Lol. OK buddy
Makes me wonder how well his glasses work
my old tv is able to do 1080p better than my friend's 4k tv that cost more than a pc alone
the old tv cost about £200 and the smart tv £450
I think his statement stems from the example of a 47 inch tv. With that and, say, four yards viewing distance, there are surely more important aspects than pixel density. For a 70 inch screen at the same distance, ofc the difference is huge.
And 4k is way better than 1080p as long as it's more than 60Hz; a 120Hz 4k will beat any 8k in 2020 until about 2022
You tech morons are pathetic, do you losers sit in front of screens all day or something?
People have brought up this argument since HD days and it's been wrong since then. Your eye doesn't see in pixels.
No, its been right since then. Look up visual acuity.
I mean they both have a point. The increase in resolution only makes sense if screens keep getting bigger and the living rooms get smaller. Standard definition stretched over a 50" set 3' away is borderline gut-wrenching, while a 14-18" set 6' away doesn't need to be HD.
@@hhs_leviathan Exactly.
"Your eye doesn't see in pixels."
Rod and cone cell in the eye: "am I a joke to you?"
I use a 48 inch OLED TV as a monitor sitting 3-4 feet from it and 1080p looks like garbage compared to 4K. 1080p looks fine when watching movies on it from 10 feet away though.
Kind of sad that he never said anything about ppi and why that’s basically what matters
I know right 🤷♂️
He is tech youtuber and there is science behind him.
So thats why Apple say phone this size 720p is same as 1080p. Eye cant tell. Then Samsung release 1440p as gimmick cause phone too small to tell anyways. hmmmmm. I learn alot.
@@KhoiBoa Yes, that is precisely what is going on. So 1080p on phone is basically like 4k on your PC screen cause of how dense the pixels are.
@@KhoiBoa When you look at your phone, you are looking at it from close distance. The smaller the distance the better you can see detail. So you cán tell the difference between 1080p and 1440p on lets say a 6 inch smartphone. If not, your eyes just aren't that good
Everyone: talks about 8K
Antialiasing: "Am I a joke to you?"
Yeah this guy gets so many things wrong, misses other things, etc... bad video overall, like the dude years ago saying we can't see in 4k, what dumb dumbs...
@@blahuhm6782 I play on 1080p from a couch. Anti aliasing barely makes a difference. So the problem is, your eyes can't notice much of a difference between 4k and 1080p, especially on smaller screen sizes. It becomes significant on bigger TVs which take up half of your wall's height. Proved by the fact most gamers never upgraded to 4k. 1080p will be the standard for the terribly tiny screen sizes most gamers play on
@@blahuhm6782 what did he get wrong or miss?
@@meghanachauhan9380 Once 4K 60 is more affordable it will be the standard. Obviously in gaming, no matter how tiny the increase, better visual fidelity is ALWAYS helpful. It's a must in any sort of competitive gaming; those small increases can give you the edge over an opponent. The problem is, right now I'm not gonna sacrifice 60fps to play 4K. And I'm not gonna sacrifice my wallet to game at 4K 60.
@@meghanachauhan9380 more like 1440p, so you get some margin
I was waiting for the “Science” part of this. Such a loose term these days...
Science: chances are you will not be able to tell the difference in quality and sharpness detail, the reason being there are millions and millions packed in here.
Wow so impressive...
@@BerkeBoz Not really science. You kind of just made his point.
I worked at the UW eye lab in undergrad for a year. During that time we actually tested these sorts of side by side comparisons. We color tuned the monitors to 99% accuracy using the Adobe Color Space, set their brightness to be identical, and ensured that the pixel gaps for each monitor were as close to identical as possible (To prevent the screen door effect). At typical viewing distances we found people with 20/20 vision could quickly discern the difference between 1080p and 4k with greater than 80% accuracy.
Now if your vision has deteriorated, you'll have problems. If your viewing distance is greater than average, you'll have problems, and probably most relevant, if you just don't care about image sharpness, and aren't looking for the differences, you may not notice them. Interestingly, while people sometimes don't notice increased pixel density, they are very sensitive to decreased pixel density. So someone who has been viewing 4k images for months, and didn't think the difference was large before, now suddenly subjected to 1080p video, will notice a stark decline in sharpness, this is double blind mind you.
Realistically the sweet spot for most human vision tends to be at something between 1080p and 4k. Maximum visual acuity is around 35 arc seconds, with 20/20 vision falling between 35 arc seconds and 1 arc minute. That said, if the pixels of a device could occupy our entire field of view, the resolution required to ensure no artifacts and maximum use of our visual acuity is a little above 20k (200 degree field of view divided by 35 arc seconds is 20,571.43 pixels).
Realistically, that's not how the human eye works. Only the fovea is high resolution, so if a TV were to actually simulate human vision, it wouldn't have the same resolution across the entire panel, we would only stare at the center of the screen, which would be 35 arc seconds resolution at whatever distance we were sitting, and the resolution would sharply drop off as you traveled to the peripheries of the screen. For the purpose of VR this is useful to know for foveated rendering, but the resolution of the entire screen must still be at maximum fovea resolution since the eye rotates independently of the head.
In addition, a lot of our visual system is neurologically mediated. This means for complex images that aren't faces we have an effectively lower resolution of viewing for most people because of how your brain processes images. For faces your brain creates a truer to life representation requiring higher fidelity, and for abstract single objects the resolution is maximized. A single pixel white line slowly rotating on a 4k screen will show obvious visual artifacts called aliasing, because our brain and visual system is capable of picking out the pixels and obvious artifacts of such a simple figure. Artifacts will in fact continue to be visible all the way up until 16k for people with "better than 20/20" vision, meaning the best vision available. For people exactly at the threshold of 20/20 vision 10k may be enough for them to cease to see artifacts, since their resolving capability is at an arc minute.
For most visual patterns though, the brain simplifies, tending to cluster together objects to make colors, patterns, shapes, and identifiable figures. Representations are created that aren't really there for peripheries. You think you see in color and in detail things out of your peripheral vision, but you really don't, it's just your brain filling in the gaps for you.
In any case... "generally" for most people, particularly as you get older and your vision declines, and if you have your TV sitting 10' away across a living room, 1080p is more than adequate... If you have good vision and you sit within 5 feet of your TV, this video becomes relatively inaccurate. If you are playing in VR and the screen is an inch from your eye, this video is totally irrelevant, because it in fact did not explore any of the science of visual acuity at all; it just made general statements about resolution that have nothing to do with eyesight.
@@dragoonsunite bro, this was an awesome study! Thanks for sharing! These were the kinds of things I was looking to learn in his video.
@@dragoonsunite the hell
@Daharen Wow this comment is actually really interesting and useful in contrast to the video I just watched^^ thx mate keep up the good work!
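The 20k figure in that comment checks out arithmetically; here's the same calculation spelled out (a 200-degree field of view sampled at the stated acuity):

```python
fov_deg = 200
acuity_deg = 35 / 3600              # 35 arcseconds, in degrees
print(round(fov_deg / acuity_deg))  # 20571 horizontal pixels, as quoted above

# At the 1-arcminute (threshold 20/20) criterion instead:
print(round(fov_deg / (1 / 60)))    # 12000 horizontal pixels
```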
0:06 Video begins
Is “I’m a tech RUclipsr” a credential these days?
He said that to prove his point, not to try put his opinion on a pedestal for you
LifeAccordingToMayo well whatever point he’s getting at is misleading since he didn’t mention pixel density and tries to fabricate a discrepancy through “marketing.” As if TV manufacturers don’t disclose their actual pixel count and aspect ratios before you buy
Noah Mathis he didn't mention pixel density keeping in mind viewers like you would already understand what he's getting at. He mentioned the 47 inch 1080/4K TV vs the 110 inch 1080/4K TV to explain to the average consumer that images become softer as pixel density decreases. And also TV brands do mislead with their screen sizes as they include the bezels in the measurement
@@xKingston111 that's true. It's just like making the price 59.99 dollars instead of 60 dollars, for example. No difference at all, but 59.99 still feels like less. Marketing is everywhere, and whatever he said I already knew, so he didn't tell a single lie. Except 16k is a marketing hoax lmao.
Probably more than being a youtube commenter
7:53 as he said that I checked my YT settings and it was on 720 cause it's on auto. When I switched to 4k, completely different person lol.
3840 is awfully close to 4000 and it makes perfect sense to call it a 4k tv. Hell, what we call a 2x4 stud in construction is actually 1.5x3.5 inches. I don't think anyone feels ripped off :)
I own a 48" 4k tv and the first time I noticed a huge difference between standard and 4k settings was when I played HZD. It's noticeable.
yeah right, it's clearly noticeable.
I'm still waiting for the scientific part you mentioned at the beginning
"The human eye will barely be able to tell the difference" - He didn't even explain why the human eye won't be able to tell the difference at 47" but would be able to at 110"
@@TheJwwinter because of pixel density
@@TheJwwinter small ting look little. big one look big
@@xionova3254 Thanks. Perfect explanation. It all makes sense now.
His last few remaining brain cells didn't make it to the end of the video
Everybody every time there's a new standard resolution: "Let me explain the science about why this is pointless."
Remember when people said your eyes can't notice the difference between 720 and 1080p and then 1080p and 4K? I do.
He said it is pointless for certain ranges of TV sizes. 10-15 years ago 32" was the norm... today it is up to 55"... In the future, who knows?! Maybe a 100" screen or projector...
They said this when screen sizes of the average TV and desktop monitor were much smaller. The American broadcast networks went to 1080 in the 1998-99 period. The average LCD TV at that time in the United States was 24 inches. At that size, there's no significant difference between 720 and 1080 for average viewing distances for TVs.
The consumer 4K standard was set in 2012. At that point, average TV size was 38 inches, and no significant difference between 1080 and 4K at that size for average TV viewing distances. Let's use a 42 inch screen, since that's a pretty standard size. A 42" 16x9 screen has actual dimensions of 36.5 x 20.5 inches, giving, at 1080 resolution, a pixel size of about 0.019 inches. If you're sitting 8 feet away from a TV of that size (not unreasonable), each pixel would have a visual angle of 0.0113 degrees. At 6 feet away, 0.0151 degrees.
The problem is, a person with 20/20 vision has a visual acuity of only 0.0167 degrees. This means that a 1080 42" TV at 6 feet away is, essentially, at the limit of human resolution for that person. If you had a series of alternating fixed black and white lines on that TV screen, each a single pixel wide, it would be _just_ detectable that it's a series of lines and not simply a solid gray screen. If you doubled the resolution (to basically 4K), each pixel would be, at 6 feet away on that same size screen, 0.0075 degrees, which is impossible for a human at that distance to make out. The series of alternating lines would just be flat gray. The only way you'd get the detail back would be to double the size of the lines...which is taking you back to 1080. And this is before taking into account image movement, which TV usually involves, which would cause blurring and loss of such fine detail anyway.
You can do this for other screen sizes. On a 60" TV, 4K is still not quite in the range of (normal) human visual acuity at 6 feet away. It's close (0.011 degrees), but back off to 8 feet and it drops to 0.0084, or about half of the necessary angle. Which means that on a 60" TV, if you're watching it 6 feet away you can perhaps just find it a little sharper watching 4K rather than 1080. At 8 feet away, there's literally zero difference. You might, as was pointed out, perceive it to be clearer, but that's more likely a function of the cameras and TVs having things like better colour depth and faster frame rates than it is the increase in resolution.
This is also why this doesn't apply to computer monitors. You sit closer to a monitor than a TV, and the images are more likely to be static at least part of the time which allows you to actually detect the higher resolution.
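Those visual-angle numbers reproduce almost exactly (the tiny differences come from rounding the 42" panel's width); a quick sketch of the same arithmetic:

```python
import math

def pixel_angle_deg(diag_in, horiz_px, distance_in):
    """Angle subtended by one pixel of a 16:9 panel at a given viewing distance."""
    width_in = diag_in * 16 / math.hypot(16, 9)  # panel width from its diagonal
    pitch = width_in / horiz_px                  # pixel size in inches
    return math.degrees(math.atan2(pitch, distance_in))

print(round(pixel_angle_deg(42, 1920, 96), 4))  # ~0.0114 deg at 8 ft (0.0113 above)
print(round(pixel_angle_deg(42, 1920, 72), 4))  # ~0.0152 deg at 6 ft (0.0151 above)
# 20/20 acuity is ~1 arcminute = 0.0167 deg, so both sit at or past the limit.
```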
Waiting for "Why beyond 32K is useless"
Shut up, idiot
anything after 16k is useless cause we see in 16k
@Juan Perez, his comment made sense and yours didn't. There's a limit to the resolution of the eye. You can drive a vehicle without using your legs, but you can't view a screen without using your eyes. This is why the limitations of the eyes are relevant and the limitations of the legs are irrelevant.
@@seriouscat2231 there is no limit on the resolution of a human eye, because the eye does not see in resolution. It's the same with frames per second; the eye does not see in frames.
16k will be useful for vr then
First off, I love how he says there's a scientific explanation about it and then never talks about the so called scientific reason.
Second, I don't know if he is just purely trolling or really isn't able to see the difference between 1080p and 4k.
And third, just by knowing a little bit of color theory and how it affects cinema, it is very hilarious that he is saying the thing companies are doing to "make it look better" is just enhancing colors...
I don't know about 4k vs 8k, maybe there you cannot see a difference, but 4k vs 1080!?
How close are you to the screen? In order to see 4k on a 47 inch TV properly, you should be back ~67 inches; that's a 0.01 inch eye resolution. If you sit closer, say 3 feet, you can see a difference but you cannot see the whole screen easily. Now if your TV cannot process a 1080 signal, that's another problem.
I have a 4k 32" monitor and a 32" 1080p TV and I don't see a difference between them when I use the monitor as a TV (so sitting at a distance). But the difference is very clear when I'm near it.
It all depends on size and distance. For me it is very clear that as a TV there is absolutely no difference between 4k and 1080p on a 32" screen.
Of course the images themselves are different, because the monitor is IPS 144hz and the TV is a VA panel (I believe it is 75hz but not sure), but those are contrast, color and refresh rate differences, not resolution.
But it's all relative to the distance at which you're viewing the screen, apart from slight color and contrast adjustments.
Exactly, 4k is literally 4x 1080p lol, his glasses clearly need replacing if he can't tell the difference.
Yup, 3/4 of the video is just presenting the resolutions and 1/4 is the tiny explanation, which is the most important part and he’s not elaborating on it. More importantly, the distance from the tv is arguably the most important aspect. I guess this video is for the complete beginners but even then, some thorough science could help them understand better.
Why are you saying “it’s happening 720 times” instead of “there are 720 rows”?????
Makes you wonder how many people don't know the difference between a row and a column. But I suspect that's why his explanation is so dumbed down, because there are people that don't.
Yeah there is so much wrong with this video
Noticed that too, his explanation sounds overly complicated and unusual to me
He's just not very smart about this subject. Probably fails at other things in life too
Cuz
Did I just hear 1080p will look good at 110 inches? What, 20 years ago people said there's no reason to go to 720p. It doesn't take a genius to work out how pixel density and resolution work
Me watching this in 360p :
Ahh yes.
144p*
@@jackbenton2892 64p
Aussie internet rise up
144
144p and buffering 👍
Who remembers watching 480 on a big 56” rear projection tv? We thought that was the ultimate!!
You could also have added PPI to indicate the clarity or the detailing..
E.g. a 5.5 inch phone with a full HD screen has a PPI of 400+
But a 40 inch full HD screen has a much lower PPI..
I hope you get the point.
4k is good and there are quite noticeable differences over 1080p on a laptop, but I'd like to have 8k even if the differences are minimum -- because going 1 generation beyond our eyes' ability means we're at the pointless endgame of the Megapixel Race. :)
Anything less than an 80 inch screen is pointless for 8K. If you don't have the room or the desire for a massive TV, 4K is more than sufficient, and overkill for many. HDR is the real game changer, not higher resolution; our eyes cannot resolve increased resolution past a certain point.
It's never detailed enough.. real life is so much better than 8k and I want it to keep improving
@@xpodx No, real life is full HD, that's what you see, full HD. 4k and above are higher than real life. The human eyes have limits, so over 8k is throwing money in the garbage, and anyway you need a super big screen to see the full impact of high resolution, and most people don't need 90 inches and above, so no matter how we look at it, over 8k is useless...
@@מוטיאהרן Everything you said was wrong
@@xpodx Science is not wrong, just stop telling yourself stories because you are a high-resolution fanboy or a seller of these TVs. Science has already answered everything about this; you can find optics professors saying that over 8k is bullshit and pointless because the human eyes have limits, end of story....
@@מוטיאהרן you can clearly see more pixels.. it's not bullshit..
Pixel count is overrated, the industry should work on making HDR mainstream instead.
YES
Considering 4k is already affordable now, and 1080 is great for medium to small displays, you're 100% right. How bout some dynamic range? After all that's what'll really give the image life, our pixel counts are plenty enough.
Not for gaming
@@fabiank4396 ? Do you know what HDR is
We can do both things at the same time, but I agree things like HDR and refresh rates are more important right now
Yeah, and "anything above 60 frames the human eye can't perceive."
People tried to tell me it was 24fps lol
@@AnywhereMiami 91
@@AnywhereMiami no it's NOT "a fact". The Navy has published studies of pilots being able to see images flashed for a single frame at way higher rates! A simple Google search for the paper should do it. Truth will out
I have a 144hz monitor and went to a game where I can control the fps cap; I made the cap 144fps and it looked normal, then went to 60fps and it felt sluggish and laggy. There is definitely a difference
@@williamwillaims Are you a pilot? No? Then you probably can't see it
As many point out without making the distinction: FOOTAGE in 8k is quite amazing, however displays in 8k are useless. Honestly, shooting in 8k to 16k and above gives tremendous advantages in the editing room when you need to do serious cropping and zooming (computer intensive, granted).
Just distinguish between the two: footage and display.
When he said what the p stands for, I was like yeah yeah I know, but when he said progressive I was shocked. What, not pixel but progressive?... Learned something new
Yea, and in 1080i (which most cable and satellite shows use) the i means interlaced. Some people say 720p is better than 1080i. The reason TV shows in 1080i and not 1080p is because of the bandwidth required. 1080p is what is on bluray
@@whydef269 no 1080i had to do with coming from the interlaced SD world and was to do with motion flow and to remove jitters. 24p physical film was double shuttered (i.e. the frame would flash twice on screen before advancing to the next frame), this was to help with flicker removal. The reason why 1080i was a thing for broadcast was mostly to do with sports, they usually broadcast in 720p or 1080i and interlacing mostly had to do with 1/2 of 60Hz electricity, i.e. 29.97fps.
You youngsters have a lot to learn
@@beachcomberfilms8615 everything I've read as to why cable and satellite companies don't stream 1080p has said it involves too much bandwidth and costs much more. Yes, there are movies and PPVs that show in 1080p, but that's different since you have to purchase them to watch. I haven't seen what you said as to why it originated, so that might be true, but I was speaking to why they still haven't moved on from 1080i. Most cable companies, I mean.
I remember 1080i..
@@peterpan408 RIP. I hate interlacing - it was a necessary evil for the old CRT TV days, thank God it's obsolete. Now if we could only get rid of the need for uneven frame rates in the U.S. (29.97, 23.976, etc.) and drop-frame timecode, editing would be a much cleaner task.
Take a shot everytime he says "marketing garbage"...
I did, and im mkk;mslkohjk!
@Ford Simpson you can use very large glasses for the shots...it helps...
Sure...and it's true...marketing garbage. Just like those screen sizes, especially so with computer monitors. They shave a half inch or more and still upsize the number in the Ad.
Don't get me started on aspect ratio...that 16x9 over 16x10...a rip off. Pawning off less screen real estate.
Marketing Swine.
He is the garbage
You make this sound much more difficult than it is.
Why does it take around 3 minutes to explain 1920×1080 is the number of pixels?
I honestly thought that I was going to learn something today; I learned nothing. He just ended up explaining extremely basic information. He never even got to the point as to why an 8k TV would be pointless: as pixel count increases, so does your PPI. This allows TV manufacturers to produce much larger TV dimensions at minimal fidelity cost.
fr, he really took 5 minutes to explain to me 3rd grade multiplication, and he made the point of saying the "p" in 1080p didn't stand for pixels, but didn't bring up 1080i or what the "p" actually means , and goes ahead and shows you it's the amount of pixels on one side of the screen, which makes it look like "p" means pixel
@@matthewdominic4336 basically he meant that 8K for consumer TVs is the same as 1080p, but depending on the size, like in cinemas, it would be more useful there
@@DISDATRONA If only I could agree with that; it would make buying a TV so much simpler and cheaper. The problem is that I do see a stark difference even between 1080p and 4k, and minor differences between 4k and 8k.
I agree that more than 4k is kind of useless on a TV. But to say that 1080p and 4k are almost the same on a 47'' is straight up a lie.
Agreed, this video is garbage. Unless you have a 20" tv or are sitting at the other side of the room, there's going to be a big difference between 1080 and 4k. What is this guy even talking about. Even 8K is highly noticeable if you have a large TV.
True. I have a 55" TV and I sit about 2 meters away from it. I can tell the difference.
1080p is better if it's got 2x the framerate. 30fps is a bad joke at any resolution; even 60 is low, but a bigger improvement than 4k30 vs fhd30. We should aim for 120fps minimum, but samsung and lg want it different, so we are stuck with high res that is blurred down to low quality through motion blur.
@@pflaffik I agree on that for gaming. But recently I've been watching TV Shows and Movies on high refresh rates and it kind of sucks. It may be because we're used to the "cinematographic" 24fps, but I don't really like high refresh rates on TV content. They all look like soap operas.
Reminds me of a Circuit City employee telling me LCD was useless when I wanted to upgrade from my flatscreen
"I'm saying that as a tech youtuber" - like that's a credential 🤣
Samsung phone s21 proves
No you are an idiot, he clearly meant that as a tech youtuber he should if anything be over enthusiastic rather than sceptical
Seriously. Can't even focus on the video topic. It made some good points.
Im confused. Is it a joke or a hate comment? I think it's a joke
like MKBHD ....haha
On my Wacom Cintiq Pro 24 I can definitely tell a very clear difference between 1080p, 1440p and 2160p resolution when I'm drawing or playing a game.
It 100% matters when it comes to drawing tablets xD
My laptop has a 4k 15" 16:10 screen. I never watched videos on my old laptop because it just looked terrible with full HD. I think 4k is good for 13" and above and is absolutely necessary for 15".
oh yeah full hd looks so terrible lmao
"640 kb of memory should be enough for anybody"- Bill Gates
Totally different things
We had to increase the size and speed of our computer memory to match all the development in other parts like the cpu and the gpu and the primary storage.
But I don't think our eyes are gonna evolve enough to fully appreciate 16k or above, or even 8k, in our lifetime
lol I was thinking about the exact same quote whilst watching this. Thank you
Fake quote, please don't repeat it again and again
He wouldn't dare say it, now. 😂
@@fahis500 you're damn right they're not going to evolve😂 evolution takes place over thousands of generations, you can't evolve once you're alive. Only offspring have the chance to evolve, and there's no survival need for our eyes to see in any more detail. Your eyesight might change over time, but that will only be for the worse if you're staring at a screen all day
I love how every time someone releases a video regarding basic facts, it's bombarded by 400+ OPINIONS. Guys, he didn't even mention the impracticality of 8k streaming, where 6GB = 1 minute of video. I have no problems purchasing something I have a need for; I simply don't need anything beyond 1080 at the time of this comment, as I don't have an overwhelming desire to turn an entire wall into a screen. If that's your thing though, good for you.
MrBaxtrax I’ve got a 75” 4K. Looks good. I can’t see anything beyond 16K, because no one would have room for a TV capable of illustrating the difference between 16K and 32K at the necessary size.
Jeremy Broussard try beyond 8k. A cinema couldn’t really make use of anything beyond 8k. A home doesn’t have use for anything beyond 4k.
The only reason I'm planning on getting a 4K OLED TV is mainly for the HDR, not the 4K
@@shadowhunter0 Wish they made 1080p HDR screens. I have no use for the 4k resolution since I have nothing that plays at 4k to even utilize all those pixels.
@Jalau 16K+ is definitely useful for anything above 40", just not at this moment. By the time 16K is mainstream, people will be using screens as replacement for windows with their own artwork/pictures being displayed. At that point, much like artwork, people will want to be able to stand close and admire the increase in picture quality. The way we use screens today is not the same as in the future.
Me keep changing from 144p to 1080p seeing if there’s a difference while watching this vid
Bet you can't go 16k for sure.
why tho? How would there not be a difference? It's like 50x the pixels.
@@phantomnarwhal164 you can’t rlly tell
@@obeebemahasee1130 are you legally blind perhaps?
144p helps blur this guy out, which is the appropriate choice because what he's saying is so controversial
Sorry, but you're wrong. The human eye can easily distinguish between 1080p and 4K, or even 8K. At its peak, the human eye can see the equivalent of about 576 megapixels (better than 8K!). Maybe it's because you wear glasses and are visually challenged that you can't see the difference that others with healthy eyesight can. I had the privilege of seeing a display at a technology expo, and I can tell you the difference even between 8K and 16K is phenomenal!
Seriously, they said the SAME EXACT THING for 4K.
even said it about 1080 in thr day
They were right. Look up visual acuity.
No one said that tho
I don't think so, at least not like that. Most likely it was because TVs between 40-50" were the majority, and everything much bigger, like 70-80", was pretty expensive and therefore also a niche. And on 40-50" you usually don't really benefit from the resolution increase, as 1080p is already very sharp at that size with common viewing distances. But as UHD ("4K") TVs came with other technological advancements and also got more affordable, of course they became the mainstream, with 1080p TVs now being mostly dead.
Now the average TV size is most probably higher, but is it big enough to warrant the need for 8K (is there a proper name?) TVs? The size would need to double for it to be worth it, and we're not at 80-100" being the norm; it's probably more like 50-60". So for most people 8K is really useless, and it also doesn't seem to come with other advancements AFAIK.
Is 8K resolution itself useless? No. For production you want higher resolutions than the consumer receives, to have headroom for processing the source without degrading the quality of the final output, like high bit depth (24-bit, 32-bit) for audio/music, but that's a different topic.
@@dennisjungbauer4467 Average TV size vs. viewing angle, even today and for the foreseeable future, isn't big enough for 4K to make a difference.
It depends on how far you are from the screen and on the size of the screen
Appreciate the explanation and breakdown of the numbering system regarding pixel count, but you can't say that there isn't a noticeable difference between 1080p and 4K. Perhaps not in terms of casual television, but as far as graphics are concerned while gaming, the difference is very obvious.
People once thought anything beyond 4 kbps was useless, and here we are.
I think the cost is a driving factor. A few years ago people were telling me not to record video in 4K, but I could tell the difference, and I'm glad I did so I am not re-doing all those videos as monitors get larger and 1080p looks worse. Higher Ks will eventually allow us to move our monitors to a full wall and get more desk space. Office meetings will be planned on walls. Higher K brings more options on larger screens, but the price matters.
WELL PUT!!!;)
Why is it that I always hear people say “any more than 4k is useless” and “anything more than 60fps is useless.”
It’s complete BS. You can tell the difference between both
Because they are idiots. Here is the math: a 100 inch 8K panel is the same as four 50 inch 4K screens. Both setups provide about 33 million pixels. The baseline resolution is 50 inches at 4K, which is exactly the same pixel pitch as 100 inches at 8K, which in turn is exactly the same as 25 inches at 1920x1080. This pixel size and pixel pitch/gap has been the baseline for many years now. If anyone wants super high pixel density, look at your 5 inch phone, which packs far smaller pixels (but it's only a 5 inch panel); you would need a microscope to see the pixel gap.
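The math above checks out, give or take rounding. A minimal sketch (standard 16:9 resolutions assumed) verifying that all three setups share the same pixel pitch:

```python
import math

def ppi(diagonal_inches, width_px, height_px):
    """Linear pixel density: pixels along the diagonal divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The three "equivalent" setups from the comment above.
print(f'25" 1080p: {ppi(25, 1920, 1080):.1f} PPI')    # ~88 PPI
print(f'50" 4K:    {ppi(50, 3840, 2160):.1f} PPI')    # ~88 PPI
print(f'100" 8K:   {ppi(100, 7680, 4320):.1f} PPI')   # ~88 PPI
# Identical pixel pitch at every step; only the total canvas size grows.
```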
more accurately over 8k is useless, and over 120 fps is useless
@@VYDEOS2 120 fps, yeah. However, 8K does have a purpose.
A 100 inch 4K screen is going to have really large pixels. To make the 100 inch screen have the same pixel size as a 50 inch 4K screen requires 100 inches at 8K. Larger screen size drives higher resolution.
@240mains That's what pixel density is. 4K on a 100 inch screen will look bad because there aren't a lot of pixels per square inch, since the pixels are bigger. However, 4K on a 20 inch screen will sure as heck look better than 8K on a 100 inch one.
They should start measuring screens by pixels per square inch rather than by total pixel count. Apple Watch screens look nice even though they are technically only about 360p. And yet your 80 inch 4K TV doesn't look so good.
So basically, anything higher than 360p is useless on the Apple Watch screen, but not on a TV. Most people don't own a TV big enough to see a clear difference between 4K and 8K. Until everyone starts owning hundred-inch TVs regularly, 4K and 8K are virtually the same.
And then there's distance. If you view close up, you will see the pixels and the image will look bad, but from further away, the difference between 4K and 8K is even smaller.
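Taking the per-square-inch suggestion literally, here's a rough sketch of the comparison (the watch dimensions are approximations for a ~1.8 inch, 448x368 panel; for square pixels, area density is just linear PPI squared):

```python
import math

def px_per_sq_inch(diagonal_inches, width_px, height_px):
    # Linear PPI squared gives the density per square inch for square pixels.
    ppi = math.hypot(width_px, height_px) / diagonal_inches
    return ppi ** 2

watch = px_per_sq_inch(1.8, 448, 368)   # approximate Apple Watch panel
tv = px_per_sq_inch(80, 3840, 2160)     # 80" 4K TV

print(f"Watch:  ~{watch:,.0f} px per square inch")   # ~100,000
print(f'80" TV: ~{tv:,.0f} px per square inch')      # ~3,000
# The tiny screen is packed over 30x denser, which is why it looks so crisp.
```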
Shouldn't it be called 2K then? I mean, the name is misleading, as it starts counting the columns and not the rows like 720p and 1080p do.
1920x1080, or the equivalent theater resolutions, is often called 2K because the 1920 horizontal pixels approximate two thousand.
I remember hearing this when 1080p went to 4K. With two TVs side by side you can 100% see a difference. They also say 30 fps is all the human eye can see, but any gamer will sit in front of a screen and tell you the difference between 60 and 140 fps without even knowing the actual fps.
Me watching this in 144 p
Finally someone gets me
I literally got a recommendation for "why 8k is not pointless" after this video
Lol
lol
this video: exists in 4k
my wifi: (softly) dont
I did it. It worked.... After ten minutes.
@@leavemealone2154 lol
Mine starts breathing heavy at 480p😭
@@gontsezachariagalane1720 wow
I find it strange that between 1080p and 4K they switched the side of the screen they are measuring. 1080p is 1080 pixels on the vertical side and 1920 on the horizontal. 4K is 2160 pixels on the vertical side and 4096 (3840 for UHD) on the horizontal. Yet 4K gets its name from the number of pixels along the horizontal side, whereas 1080p gets its name from the number along the vertical side, which is considerably less. Using 1080p's logic, 4K is actually 2160p, but using 4K's logic, 1080p is actually 2K. It's almost as if someone wanted 4K to sound like a bigger jump from 1080p, so instead of using the same scale and showing a twofold improvement, they switched scales and made it seem fourfold. Unless I'm missing something, idk.
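The commenter isn't missing anything; the conventions really did switch axes. A small sketch laying out both naming schemes for the common formats (3840 is consumer UHD; 4096 is the DCI cinema width):

```python
# "p" names count vertical rows; "K" names round the horizontal column count.
formats = [
    ("1280x720",  "720p",  "~1.3K"),
    ("1920x1080", "1080p", "~2K"),
    ("3840x2160", "2160p", "4K UHD"),
    ("7680x4320", "4320p", "8K UHD"),
]
for dims, p_name, k_name in formats:
    cols = int(dims.split("x")[0])
    print(f"{dims:>9}  rows-name: {p_name:5}  columns-name: {k_name} ({cols} wide)")
```

Note that each "K" step doubles both axes, so the total pixel count quadruples; that is exactly where the twofold-vs-fourfold confusion comes from.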
TLDR: look up pixel density lol. Saved you 10 minutes
Arc seconds, if you really want to understand the relationship between DPI and distance.
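For the curious, a minimal sketch of that relationship, assuming the common one-arc-minute (60 arc-second) figure for 20/20 visual acuity:

```python
import math

def pixel_arc_seconds(ppi, distance_inches):
    """Angle a single pixel subtends at the eye, in arc seconds."""
    pixel_pitch = 1.0 / ppi  # pixel size in inches
    angle_rad = 2 * math.atan(pixel_pitch / (2 * distance_inches))
    return math.degrees(angle_rad) * 3600

# ~88 PPI is the 50" 4K (and 25" 1080p) pitch from the earlier comment.
for feet in (3, 6, 10):
    arcsec = pixel_arc_seconds(88, feet * 12)
    verdict = "resolvable" if arcsec > 60 else "below 20/20 acuity"
    print(f"{feet} ft: {arcsec:.0f} arcsec per pixel -> {verdict}")
```

At about 3 feet those pixels sit right at the acuity limit; by 6 feet they are comfortably below it, which is the whole distance argument in a few lines of math.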
You said exactly what I was thinking. Wtf is this guy on?
The three of you are human, so it's no shock that you'd all jinx each other.
2:50 "so imagine" no need to imagine. just move really close to your screen and you'll be able to see the pixels
When we start getting displays of >100", it will matter...
It already exists but I don't own the wallet to go with it.
They do exist. Go to virtually any sports arena/field and look at the "jumbotrons". At that size, 8K COULD be discernible from 4K. On a normal home entertainment TV, at the average size and the optimal viewing distance, the human eye will not be able to tell the difference. But he most likely means home displays, and at 100" those are too costly for most: that's either a theater room with a projector or a very large empty wall with a room large enough to accommodate it. Again, most people are limited by one or the other. So, for most of us, 8K isn't realistic economically, and even if it were, it's still not realistic practically.
A 100 inch screen will still not need 8K or 16K, I believe, because the optimal viewing distance will be greater, and thus the apparent size of its pixels will be the same as a 4K TV's from 2 or 3 meters away.
The question then becomes, though: can you afford a house with a room big enough to make use of that TV?
@@xxJing The house walls, ceilings and floors will all be made of screens, making 8K too low, actually.
If you wanted to make the case that the difference between 4K and 8K may not be very noticeable to the human eye, I think most of us would agree with you on that. But to say that the difference between 1080p and 4K isn't noticeable, or is barely noticeable? I'd say that's a bit much.
Also, 8K and 16K aren’t useless.
8K and 16K make TVs over 100 inches possible without the need to switch over to a projector.