I remember when LCD monitors started to become widespread in offices in the early 2000s. Boy were they shitty. Grainy, blurry, and unresponsive. They only made the bean counters happy, due to lower energy consumption (also for the air con) and more real estate on the desk.
One thing that this video doesn't mention about the disadvantages of CRTs is their power consumption and the influence of the environment on image quality. I remember from my childhood that a CRT, in comparison with even early LCDs, looked like washed-out crap with non-existent black levels and pain-inducing flickering. The reason is that during daylight CRTs were not able to produce a bright enough image to win against the power of the sun, so the whole image became a grey mess. It wasn't an issue during the nighttime, but let's be honest, as a kid you could not use a PC that late, or even be awake, and most people work during daylight hours. So the crappy image quality was prevalent in exactly the circumstances when CRTs were meant to be used.
12:42 More vibrant colors and better black levels are contingent on having absolutely ZERO glare in your room with a CRT. Any ambient light and an LCD will win. This is coming from a guy who uses both currently.
It's not just glare. Look at a CRT and a flatscreen with the lights on and the screen off. The flatscreen is jet black while the CRT is dark gray at best. Those colors are the darkest their blacks can be under those lighting conditions. And not all of us like having to watch or play in complete darkness.
Yeah I hate playing in complete darkness... fairly certain that's bad for your eyes too. Also, @stevethepocket, on a totally black image you're correct. However, the ANSI contrast (a more accurate measurement) is roughly the same between an LCD and CRT.
@@GAMERSLAYER-o4j LCDs have a terrible color gamut; they lack that CRT or QD-OLED depth. I don't believe those measurements are correct. It's a night-and-day difference between the screen technologies simply by using your eyes.
The bad part is that people like us who like zero lag and no blur are labeled as picky and fussy. The very first time I used an LED TV I felt the horrible lag. Unfortunately, in the past my family forced me to get rid of my CRT monitor because of space. It was an LG Flatron EZ 17" (T730SH); I feel sad about it, but thinking it over and over will only make me feel worse. I will eventually buy another one. I hope the industry will still manufacture some CRT monitors and TVs as retro slowly grows in popularity. The problem is that what dictates the trends is the masses, and most don't care about the purity of the gaming experience. Hope never dies; I keep positive.
I wish I lived in an alternate reality where SED/FED tech took over LCD in 2005 and then the tiny electron emitters mimicked CRT strobing to lower motion blur.
The best thing about CRTs is the thing you can't explain to someone who hasn't experienced it: the immersion in the content. It's almost like VR when compared to LCD.
And you definitely cannot explain it to people who weren't visually trained to recognize frames at 1/20th of a second. At 0.005 seconds was where I began to notice anti-gravity effects on a broad scale... my goodness, outer space not having gravity means that light goes both directions so fast that it alters gravity itself... that is what the physics engine realspace2 produced into the world... the theory became a reality 100%.
The minimum acceptable refresh rate on a 17" CRT was 70Hz. Anything less had obvious flicker. LCDs don't flicker, because they don't rely on phosphor persistence. Then there is the much lower resolution and smaller screen size of any kind of affordable CRT, combined with the huge amount of space they consume. The blurriness of a CRT is of course an advantage when playing ancient consoles, but you can just use an emulator instead with an LCD.
The biggest drawback of CRT monitors is IMHO eye strain. I still remember having headaches and sore eyes after spending a late night in front of the PC during CRT days. Having to deal with image geometry and soft corners was also a bit of a pain with CRTs.
They should update the technology. Samsung and Sony back in the 2000s were working on new CRT tech that made the tube just a few inches thick. Would've been awesome.
CRT monitors are not bad and they are not ugly. They have personality and they are robust. People say they are fragile but they are not, you just have to be careful not to drop them like you have to be careful not to drop your flat monitors. Honestly I am wondering if I should invest in a nice CRT Monitor because I rarely play any FPS games.
Just get any old 1280x1024 @ 85Hz 17- or 19-inch of any brand; they should be cheap. I have a 1280x1024 Dell M990 from 2000 and it's great. CRTs at this resolution are amazing and give you the flexibility of a higher-def top limit at a high refresh rate, while also giving insane refresh rates at lower resolutions. Plus anything bigger than 19 inches imo is too big and heavy unless you have the space.
Same here. Mine is a CTX EX701F which just a few days ago stopped going into the menus, reset all the picture settings to defaults, and shows up on my PC as a "LXH-GJ769IIF", which makes me suspect some sort of firmware corruption. I don't want to get rid of the CRT, but I might have to replace it.
DLSS and FSR do wonders with the frame rate for gpu limited scenarios. But yeah, I wish I could get a good 1080i crt, but those are so expensive… I wish someone started to produce crts again. And I mean they are freaking particle accelerators which is awesome.
12:04 Don't forget CRT color convergence never being perfect. And a big benefit of LCDs is the backlight can have no flicker (good for static images, not great for motion).
I made a Pong clone… on LCD I got a square for a ball; on CRT it was round. Love CRTs' blurriness for games. LCD is good for text and fine-line tech drawings… CRT for visuals. Beautiful colors and a natural look. This is why I still sport my CRT and will never get rid of it.
@@GTXDash You didn't like my comment? I'm not referring to the comment I'm making now, or the other one I made that says hello; maybe you didn't see it. I made a comment to you and I hope you see it. It's not hate.
if you ever want to use these, I suggest getting sunglasses and eye drops to use every 5 minutes, because I remember these were painful to look at for more than an hour
16:18 My SyncMaster 955DF is a 1856x1392 VGA CRT monitor. I counted twice the number of phosphor dots for text versus pixels on my 1080p LCD; it's basically a true 4:3 1440p monitor. Not only is there no aliasing on my monitor, but text is way, WAY sharper than on my 1080p LCD.
The solution should have been SED/FED tech at 60-75Hz, with its individual electron emitters flickering naturally... I'm really bummed they never made it to production 😢☹️
Thankfully CRTs have been outdone now, with 480hz OLED beating them for picture quality whilst having incredible motion clarity with black frame insertion, or even better, native 480hz. The essentially 0ms pixel response makes all the difference at such high framerates, whereas LCDs' slow response often makes the benefit beyond 240hz almost nil.
@@mikafoxx2717 Black frame insertion has dark-image problems, and it's not the same as CRT line strobing. Also, native 480Hz clarity requires motion to be 480FPS. A CRT only required 60FPS@60Hz or 75FPS@75Hz, etc. You don't get motion clarity when your game is running at 60FPS on a 480Hz OLED or whatever, sadly... since the issue then is that no real games run at those frame rates unless you are running 90s-2000s games or a competitive FPS at super low quality settings.
@electricblue8196 it is the same as strobing when you actually make it strobe. There are CRT emulation shaders doing that too. And with 1000-nit OLED panels, you can flash them bright quickly if you trick the panel into HDR mode or some such with software, for BFI or the CRT scanline shader that Blur Busters just came out with.
@@mikafoxx2717 The phosphor-persistence emulation and strobing approach is interesting, versus plain full on/off strobing, which looks annoying and worse in software (UFO Test BFI)... but I have a feeling this sort of thing must be done in the monitor's hardware, not in software.
So to sum it up, you can push as many new frames to the display as you want but if it's not clearing/blanking the old frames first you've got motion blur.
Exactly. Especially if the blanking is longer than the time a frame is visible. BFI is very limited in this regard. This is why backlight strobing is what display manufacturers should continue to improve on.
Your point around the 7 minute mark is exactly why I push for black frame insertion in modern displays, because these MFs don't realize increasing the refresh rate doesn't matter if you can't drive it. On the oled_gaming subreddit right now, you can find people literally buying $800-1400 OLED displays whose computers can't even drive them properly. It's just insane. I could drive a 360hz OLED playing Counter-Strike 2 or Hunt: Showdown or World of Warcraft, but most gamers cannot.

What would be superior is an LCD/OLED that mimics a CRT in terms of pixels on vs pixels off, giving our eyes that same illusion of perfect motion clarity. If they developed an OLED that was, say, 240hz and included a true black frame insertion mode that kept the full refresh rate, it would be pretty sweet. As it stands, modern BFI will generally double your visual performance: 120hz + BFI looks exactly the same as native 240hz, and 240hz + BFI would look like native 480hz. And obviously, with lower fps, BFI helps too.

Sure, new 4k monitors are coming that are 240hz and have a 120hz BFI mode (it runs at 240hz, but only every other frame is displayed: frame, black, frame, black, and so on). For console, that 120hz + BFI mode might be awesome, but most people already have 120hz OLED gaming TVs with BFI. Not to mention television BFI is superior, because both LG and Sony (the top brands for OLED TVs) use a "rolling scan" black frame insertion, and it works. Hell, my Sony Xperia 1 Mark 2 uses rolling-scan BFI and it's only 60hz, but it feels like 120hz. It's amazing. Watching movies/anime on my phone in bed is way nicer than on my PC gaming monitor, simply because my AW3423DW 3440x1440 175hz OLED doesn't have BFI, so you get extreme judder in panning shots, clear as day in anime; yet on my phone, which is OLED + BFI, you can't even tell it exists.

I would kill for a proper 240hz + BFI OLED gaming monitor. Sadly, we aren't getting it yet. I had that shitty Blur Busters-certified 1080p ViewSonic IPS which had BFI at 60, 120, and 240hz. 60hz BFI made my eyes hurt; it was atrocious. 120hz + BFI wasn't bad and looked like native 240hz. And 240hz vs 240hz + BFI showed no difference, because the pixel response wasn't good enough to show one, so it's effectively capped at 240hz with or without BFI. OLED, however, has true sub-1ms pixel response, so BFI at any level would be amazing. I really want true 240hz + BFI.

A solution to the native aliasing of CRT displays for LCD/OLED? A honeycomb subpixel structure. Instead of red/green/blue subpixels, we need a technology where one subpixel can produce the entire range of color, laid out in a honeycomb/hexagonal pattern. And since the subpixels would be so small, you'd end up with a sharper display, as the PPI would be much, much higher.
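A minimal sketch of the persistence arithmetic behind the "120hz + BFI = native 240hz" equivalence claimed above, using the common rule of thumb that perceived blur scales with how long each frame stays lit; the pan speed and the ~6% CRT duty cycle are assumed example values:

```python
# Rough persistence math behind "120 Hz + BFI looks like native 240 Hz".
# Assumption: blur on a sample-and-hold panel scales with how long each
# frame stays lit (roughly, 1 ms of persistence = 1 px of blur per
# 1000 px/s of motion).

def persistence_ms(refresh_hz: float, lit_fraction: float = 1.0) -> float:
    """Time each frame is visible, in milliseconds."""
    return 1000.0 / refresh_hz * lit_fraction

def blur_px(persistence: float, speed_px_per_s: float) -> float:
    """Approximate blur trail length for an object moving at that speed."""
    return persistence * speed_px_per_s / 1000.0

speed = 2000  # px/s, a fast pan (assumed example)
for label, hz, lit in [("240 Hz sample-and-hold", 240, 1.0),
                       ("120 Hz + 50% BFI", 120, 0.5),
                       ("60 Hz CRT (~1 ms phosphor)", 60, 0.06)]:
    p = persistence_ms(hz, lit)
    print(f"{label}: {p:.2f} ms persistence -> ~{blur_px(p, speed):.1f} px of blur")
```

The first two lines print the same blur length, which is the whole equivalence: halving the lit time buys the same clarity as doubling the refresh rate.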
I think one of the reasons CRTs got replaced very quickly in offices is that LCDs have a clearer, less blurry picture, which helps with text.
Yeah, when you go back to a CRT from playing on a flat panel: plugging a Mega Drive into a CRT and pressing the jump button feels like the character jumps slightly before you press it. Not a joke, it genuinely feels like that. Obviously impossible, but it's more of a feeling thing; it must be something to do with your brain being adjusted to lag.
Actually, power consumption differences are negligible, and in favor of flat-panel displays only at best: only if you're comparing square inches of display per watt, using only peak wattage draw, over a stretch of 2-4 hours. The problem with that comparison is that CRTs only hit their peak power draw in the first few seconds of startup, and their continuous wattage draw is a significantly smaller fraction of that. Flat panels, meanwhile, generally reach their max power draw and stay there roughly continuously. Monochrome CRTs generally use about a third of what color ones do, even. Basically, leave them plugged in side by side on a power consumption monitor, and the CRT will probably end up consuming less power in continuous long-haul usage.
There's truth to what you say. I have a backroom LG 1440p 144hz monitor and I'll put the wattmeter on it tomorrow to see its average. But I know for sure OLED will eat more power than a high-refresh CRT. My buddy's 27 inch 1440p 240hz OLED was using 55 watts idle at the desktop (dark background), around 86 watts with frequent spikes to 100 watts while gaming at his settings (110 watts during bright scenes), and 120 watts on white-background webpages. Idk how the manufacturers are cooking the numbers in the manual, but that's fairly high.

Whereas my Sony G520 in the living room (21 inch CRT) at 1440p 85hz (128.5KHz tube speed) will spike to 133 watts during the degauss cold startup, then average 87 watts gaming and 96 watts in bright scenes, with the occasional 101-watt spike on an almost full-white website page. Next to it (dual CRT setup for convenience lol), the Fujitsu/Siemens 21 inch (shadow mask) at 1200p 90hz (112KHz tube speed) spikes to 126 watts on a degauss cold start, then averages 80-91 watts gaming depending on the scene, and 104 watts on a mostly white internet page.

Before I got the Siemens, I was given a Sony HMD-A100 (15 inch) by the same OLED buddy. It looks spectacular after a 10-minute warmup at 768p 85hz, and it used a surprisingly low amount of power, so I put it in the shop as the music/info terminal. Lastly, my Samsung 997DF (19 inch) at 1200p 72hz is oddly close to the Sony G520 in power consumption but looks great. I keep it in a vented box with a big moisture-eater bag from work; I check the bag and bake it every 7 months to keep the Samsung in good shape as an emergency backup. Sadly I don't have the data on the last 2 CRT monitors saved on my computer, but I did test them when the wattmeter arrived.

Sorry for the long text, but it's rare to see someone with modern experience of older display tech. I daily the CRT monitors (typing this comment out on the big Sony) because they're vastly superior to my not-so-old 32 inch 1440p LCD in every aspect besides screen size. That's why I pick up 17 inch+ roadside/dump tube monitors every chance I get and test them; LCD seems terrible in comparison and I like having backups. I'd buy an OLED but it's just too expensive.
@@insurgentlowcash7564 Naw, no worries. That was insightful and kind of illustrates basically what I was getting at, with data, however anecdotal it is. Reading is no difficulty to me. I read and write effortlessly, by fortune of being effectively literate. 😄
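A quick back-of-the-envelope using the wattages quoted above (the commenter's own meter readings, so anecdotal, not specs; the hours-per-day and electricity price are assumptions):

```python
# Monthly energy and cost from the gaming-average wattages quoted in the
# comment above. HOURS_PER_DAY and PRICE_PER_KWH are assumed values.

HOURS_PER_DAY = 4
PRICE_PER_KWH = 0.15  # USD, assumed

displays = {
    "27in 1440p/240hz OLED (gaming avg)": 86,
    "Sony G520 21in CRT, 1440p/85hz (gaming avg)": 87,
    "Fujitsu/Siemens 21in CRT, 1200p/90hz (gaming avg)": 85,
}

for name, watts in displays.items():
    kwh_month = watts * HOURS_PER_DAY * 30 / 1000
    print(f"{name}: {kwh_month:.1f} kWh/month ~ ${kwh_month * PRICE_PER_KWH:.2f}")
```

By these particular readings the OLED and the CRTs land within a few cents of each other per month, which is the commenter's point: the difference is negligible either way.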
60hz flicker looks awful on a CRT, it was the bane of many a worker's existence if they were staring at it all day for work. 60hz on an LCD doesn't flicker the same way at all and modern zero flicker backlights are much nicer as well.
Pioneer and Panasonic plasmas are very close to CRT when it comes to motion clarity. I owned a Panny plasma for 12 years and gaming on my consoles was always a smooth experience. Keep in mind that the vast majority of these games were running at 30 FPS. Eventually my plasma burned out and I replaced it with an LG C2 OLED and I couldn't believe how bad the motion was. 30 FPS was simply unplayable because of the stutter, and even 60 FPS looked very blurry. And it wasn't just the motion clarity. Somehow the colors looked more natural on the plasma, and that was "only" in SDR. And then there's banding and uniformity. Both are still superior on a plasma. Anyway, I sold my C2 and went back to plasma. I bought an excellent refurbished Kuro and it's literally the best picture I've ever seen. I don't care if it's "only" SDR and 1080p. The motion clarity and natural colors are simply stunning.
Someone where I live is trying to sell a 720p Pioneer Kuro... it's tempting, but I'm not sure it can fit in my room. I'd mainly use it to game on 360 and PS3 and watch movies on Blu-ray. My Panasonic CRT is decent, but 480i, not 480p like the later HD models. It has component and, overall, I love it. It was free from a church, and given I run low contrast and just a tiny bit higher brightness than factory, I'd say it has very low hours.
@@Stoddardian PDP-1130HD. Should be a 50". Has the box and display; speakers I can get on eBay for 20 bucks. I'm mostly concerned about the hours on it. Ironically, I have a cheaper Pioneer set, but the power board is NLA and it'd cost more to get the board than a whole TV... lol.
@@roveradventures If it's cheap I would go for it. I got really lucky with mine. It was refurbished by a guy who worked as a Pioneer technician for 15 years. It still had over 90% of its original brightness left. It's an LX6090.
@@Stoddardian Indeed, it was 120. But if I go for it, I'll try to talk down the price, since the speakers are about $20 on eBay. Ideally they didn't run it at max contrast or brightness; in the Panasonic factory settings I saw, "Picture" is usually maxed out, which can really decrease CRT life.
I modified an LCD to strobe the backlight down to 1-2 ms (adjustable) and it made it so much more like a CRT with very clear moving objects, though it was a lot dimmer.
That's probably the only way we can achieve CRT motion clarity, if manufacturers try focusing on that rather than building something that goes higher than 500Hz.
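For a sense of the brightness cost of a mod like the one above: with a constant-current backlight, average luminance falls in proportion to the strobe duty cycle. A minimal sketch, assuming a 144Hz refresh and a 1.5ms pulse as example values:

```python
# Why a strobed backlight gets "a lot dimmer": average brightness drops
# with the duty cycle. 144 Hz and 1.5 ms are assumed example values.

refresh_hz = 144
pulse_ms = 1.5

frame_ms = 1000 / refresh_hz       # ~6.94 ms per frame
duty = pulse_ms / frame_ms         # fraction of time the backlight is on

print(f"duty cycle: {duty:.1%}")                     # ~21.6%
print(f"brightness remaining: {duty:.1%}")           # same fraction
print(f"LED overdrive needed to compensate: {1/duty:.1f}x")  # ~4.6x
```

This is why commercial strobing modes overdrive the backlight during the pulse rather than simply gating it.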
The main "latency" of a CRT is the time the phosphor needs to illuminate, so nothing truly has zero latency, but it's pretty much instant compared to any kind of digital image transfer.
Great video! I just honestly wouldn't downplay how great the BFI modes are in a select few recent OLED TVs, especially the LG CX/C1, and a few PC monitors tuned by Blur Busters. They can absolutely reach or surpass a CRT's motion clarity. An LG C1 at 120hz with BFI maxed is incredible for video games! It took basically two decades, but we're getting there, finally.
LCD/LED has always been inferior, but can you imagine how deep a 100" CRT would need to be? I still have a CRT monitor for retro gaming, but you need a deep desk; modern monitors take up so little room.
If manufacturers had continued building CRTs, they would've improved on the technology the same way that LCDs continued to improve. Samsung (before canceling it) got a 30" CRT to be only a few inches thick. That being said, I think the industry needs to keep working on OLED technology to get to the reduction of motion blur that CRTs were able to achieve.
@@GTXDash Can't wait for multi-stack RGB-OLED with native HDR-RBFI, which will tide us over until eQD which will hopefully have a native rolling-scan modulation method.
Excellent points all around, MXDash2! Thoroughly enjoyed how you put all this together, and you're spot on with every point. I spent a pretty high fortune on my whole PC setup, and especially when running my older games I miss my CRT, which was 1600x1200. OLED at the moment gives me most everything I want, and it's impossible for me to want to go back, but I haven't forgotten the perks of the old CRT. Back in the 90s I actually ran my TV and VCR through my GPU because everything looked so much better on the monitor. Somewhere around 2005, I think, was when I stopped using my Radeon AIW 9800XT (if I remember correctly), where I had all that set up, and everything just felt great. These days it's amazing when it all works, but I'm plagued with games that don't work with Windows, or multi-core CPUs are the issue, or graphics drivers are obsolete... A decent number just flat out don't work, and even some more recent titles don't.
Believe me when I say this: I love CRTs. I have a 29 inch Trinitron, RGB-modded, hooked to an RGB-Pi arcade setup (OS4). CRTs are great for retro gaming. That said, CRTs, generally speaking, are NOT better; there are just more cons than pros at this point. I just pray for my Trinitron to last many more years. Even after doing a recap, I know it won't last forever (it's already 15 years old).
11:11 I own both a CRT display @ 60 hertz (a Sanyo VM4209) and a gaming monitor (BenQ EX2780Q @ 144 hertz), and I can affirm your claim. They both feel about as smooth to use despite being made about 45 years apart.
These are arguably the best tubes Mitsubishi ever produced, better as far as fidelity and image quality than the slightly newer 2070SB tube, much like Sony's GDM-5002PT9 being better than the GDM-F520 despite being older; the phosphor quality and electronics are slightly better in the slightly older models for some reason.
I love CRT TVs. I still have my "40 inch" Toshiba CRT and play my PS4 on it. The image clarity, sound (bass/treble/theater-quality surround sound), color depth, contrast, etc. are amazing to me. One of the many things I love about my CRT (and it's a major plus) is that the image and sound quality have not decreased at all over the years, nor has the image started to look washed out or overlit. Every HD TV I've had (top brands too) has lost image and sound quality as it aged and always ended up looking washed out. My Toshiba CRT still looks and sounds just as great as the day I bought it. Great video!
bro... not saying I don't like CRTs, but I'm not sure I can take a person seriously who praises the sound from TV speakers... how about you just get an actual amplifier and speakers and use those instead?
If you're OK with CRTs or backlight strobing on modern displays, I honestly envy you. Here's my story: I used CRTs in school and university and got headaches after a while. Bigger, flat ones were better for me, but still painful after an hour. Some of them ran at 100hz at 768p, which also didn't help much. It never happened with TVs, but I viewed those from quite far away. Never had it on TN, IPS, or OLED; a friend's VA was also okay, though I never owned one. On the other hand, my cheap 144hz TN has noticeable motion blur. Backlight strobing does reduce the blur significantly, almost to none, but my eyes hurt after a few minutes. I guess I'm sensitive to that sort of thing.
I have a CRT monitor I got free from someone and I love it. It has downsides like, yeah, size, and danger if it's dropped or if I ever need to repair it, but it works better for me. I game at 1024x768 at 118Hz (the best I can get out of it), and it actually works *very* well; I oftentimes forget that it's not 1080p. As long as I have antialiasing enabled, at least. I did have to scrub the damaged anti-glare coating off it because otherwise it was awful, so I can't really use it without having the curtains closed in my room. I will not be getting any new LCD or any flat-panel monitor until I can get something that is as good as a CRT in picture quality and feel. And I do wish I could exchange resolution for refresh rate on modern monitors.
I used a 17" CRT monitor next to a 22" LCD a few months back. The CRT would often develop a painful high-pitched squeal. The CRT was dimmer and took a few seconds to reach peak brightness. The CRT's text was blurry. The CRT's image needed a bunch of settings tweaked to be mostly square and centered, and changing the resolution meant all those settings went wonky. I didn't really care about the weight and bulk when I later replaced the CRT with another LED monitor.
I"m still using my CRT for playing games, sometimes i use the LED ones for some games and aplications such as Adobe Animate, Blender, Premiere but my main playing monitor is the CRT one, Motion Clarity even at 60 hertz is another level, there is no modern scream that come even close to what CRTs can make, there's no input lag, colors are great, natural bloom, black levels.. CRT is the king.
The biggest issue with CRTs, which no one seems to talk about, is that they are old and will require servicing. A lot of the parts, like flyback transformers, are unobtainium. High-end CRTs in good working condition command high prices and haven't been serviced, so you are taking a bit of a gamble. I own many CRTs, and I have yet to open up a single one that didn't need significant work or servicing. Out-of-spec, leaking capacitors are almost guaranteed at this point.
I have 3 CRT monitors that I found on the side of the road for trash pickup, and they all work perfectly with no issues. I found a Samsung SyncMaster last week and the picture is insane. So freaking amazing.
There is nothing on a late-enough-made model that you cannot repair with easily found parts right now. You might need to consult a repair shop about what replaces what, but it's doable.
I hope so. If it's a high-refresh-rate display where 5 out of 6 frames are black, the 1 lit frame has to be driven about six times brighter to compensate for the darkening caused by those 5 black frames. Sounds outrageous, but that's really what it's going to take to reach the level of zero motion blur that CRTs are known for.
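A minimal sketch of that bookkeeping, assuming a hypothetical 360Hz panel showing 60fps content with one lit sub-frame per source frame (the 360Hz figure is an example, not a reference to any specific product):

```python
# "5 out of 6 frames black" on a 360 Hz panel showing 60 fps content.

panel_hz, content_fps = 360, 60
subframes = panel_hz // content_fps   # 6 panel refreshes per content frame
lit = 1                               # only one of them shows the image

persistence_ms = 1000 / panel_hz * lit   # ~2.8 ms visible per frame
boost = subframes / lit                  # 6x brighter lit frame to keep the
                                         # same average luminance

print(f"persistence: {persistence_ms:.1f} ms")
print(f"required brightness boost: {boost:.0f}x")
```

At ~2.8ms the result approaches, but does not quite reach, the roughly 1ms persistence usually attributed to CRT phosphors.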
Standard BFI, that is, straight impulse modulation, cannot produce the same type of motion IQ and performance that raster-scan CRTs can. So for BFI to make CRTs obsolete, they will need to come up with an effective "race the beam" algorithm to simulate raster scanning, which will also need to simulate the incredibly fast phosphor decay times, the phosphor glow, and all that good stuff CRTs have. The hardest thing to match will be the CRT's incredible native image depth, which with a good-quality input source can look almost three-dimensional; Looking Glass, aka light-field tech, is a good candidate.
I hated 60hz so much on an LCD that I would play Splatoon 3 on my CRT. I was able to get it to do 720p, but it died shortly after :(. Was cool af watching it die though: it started getting all wavy, turned red, and slowly faded to the center till there was nothing.
I might have missed it, but another advantage CRTs have over new monitors is that, because you can choose a lower resolution, games benefit in performance at that resolution while at the same time looking better on a CRT than on a newer monitor, because of how a CRT works. The only downside is maybe HUD elements that were programmed with higher res in mind being out of proportion.
OLED is getting close enough that it's hard to even see motion blur/ghosting. "1ms" high-refresh-rate TN panels suck balls by comparison; they used to be king for ghostless LCD gaming, but they're nothing compared to the current top gaming OLEDs. It just took nearly 30 years for flat-panel tech to get good enough to equal a CRT. I personally don't care about going back, and I am one who lived through the CRT era. My last CRT was an NEC that could do insane refresh rates up to 1024x768, and could do the max VGA spec of 2048x1536 @ 60hz, not that that mattered, since many video cards didn't have enough VRAM, or their RAMDACs at the time struggled to go that high without looking like crap. I have zero desire to go back to that era; I have no nostalgia for scanlines. Old CRTs also blur as they get older from constant use, assuming you don't have other issues like purple-tinted screens as the caps fail.
Regarding the blurring issue: you can actually tune them to be perfectly crisp again. They have adjustment screws/knobs, accessible when you remove the case, which allow physical tuning well beyond what's achievable through the menus. I've picked up multiple CRTs that were written off as "blurry", tuned them up, and used/sold them. Never even encountered one that was blurry beyond saving, tbh.
@@Skrenja CRTs literally had between 70 and 90 years of development and they are still only marginally better than the best OLEDs we currently have. There is simply only so much you can do with an electron gun and some phosphor.
You forgot to mention something you probably know very well: the fact that your eyes start to hurt after 1-2 hours on a CRT. This reason alone is enough for me to never even attempt to game on such an eye-burning mess again.
@@Kourku Continuous exposure to CRT monitors exposes you to between 0.05 and 0.60 mSv/y, so if you had a CRT monitor constantly powered on next to you, it would be equivalent to getting a chest x-ray done every 20 years or so.
I've got an Asus 27" 1440p WOLED and an Acer 390hz IPS monitor, and with them I'm still mid in, say, Warzone and BF2042; but with a crappy 17" Samtron CRT I'm always at the top of the charts and even get called a cheater. Explain that :P A cheap CRT, even at 85hz, with zero input lag, is the gaming king...
I used to play on a dual seamless monitor setup, 320hz each, and over like a month of playing CSGO every single day for hours, I could barely kill one or two people a day. A week ago I bought a completely trashed but working Chinese 1990s 12" CRT monitor, and I have been in the top 10 players IN THE WORLD these past 5 days. thx crt
As some have mentioned, the LG CX and C1 are OLED standouts so far, as they actually employ a rolling scan to achieve a CRT-like effect, yielding an effective 300 Hz or more of motion clarity. This is possible because, by scanning the display as opposed to inserting black frames, they're not limited by the maximum refresh rate. (120 Hz rolling scan is the same on both models. However, on the C1 the effect at 60 Hz was greatly diluted following complaints of flicker on the CX at 60 Hz.)

300+ Hz is still well short of the 1000 Hz needed to approximate CRT. However, I think returns are likely diminishing, as even at 300+ Hz equivalent the moving image is already starting to resolve in a really dramatic way versus 60 Hz or 120 Hz without rolling scan, which are both still quite a blur in motion. So, being a CRT enthusiast, these two OLEDs are the ones I went for. They are getting hard to find now, but some are hopefully still out there. (E.g., might get lucky on eBay, Amazon returns...)

As to complaints regarding CRT, please note that a whole range of these devices existed, from extreme budget to ultra premium. That, combined with folks' very unfortunate habit of leaving them set to the 60 Hz Windows default, was bound to leave some with unpleasant experiences. However, I run mine at 100 Hz. And they still present gorgeous image quality.
Wait, what? A CX and C1 don't use BFI like other OLEDs? I'm confused. Will their BFI make 30fps look smooth like a plasma? Because the 60Hz BFI on my C2 looks awful at 30fps. Meanwhile on my plasma 30fps looks fantastic in motion.
The CX and C1 are different. If you're actually inserting full black frames, you're limited by the maximum refresh rate. For example, a 240 Hz display is limited to 120 Hz BFI. The CX and C1 actually contain a piece of additional hardware independent of the screen's max refresh to scan the display instead. They are thus not bound by how many full frames it can do a second. This allows those two models to achieve in excess of an effective 300 Hz or more in motion. Sorry, I'm not familiar with how a 30 Hz signal is processed by any of these displays. Though I'd guess interpolation or repetition with 30 dividing evenly into 60 or 120. And then with the plasma you have the advantage regarding how it's driven giving an effective motion resolution of 200+ Hz I read. Whereas the C2 is being limited to 120 Hz, because the special hardware is gone. It's thus limited to inserting actual black frames I think.
The fact they nerfed that is insane to me. Sample-and-hold displays are infamous for their poor motion clarity and then they nerf the one thing that could improve it? Why? Seriously, what you're saying sounds almost like plasma motion clarity. If I understand you correctly, the CX and C1 actually had a rolling scan like a CRT?
@@Stoddardian Yeah. More mimicking CRT with a rolling scan pattern as opposed to inserting actual black frames. Better than plasma motion clarity as I understand it, but still less than CRT clarity. (Around 3.2ms image persistence with this rolling scan on the OLED versus 1ms on a CRT. OLED pixels aren't as bright as CRT phosphors, which constrains, but still a pretty spectacular result.) Why they did it? Apparently cost saving by removing the hardware and associated costs of further development in the face of Samsung re-emerging an OLED competitor challenging LG on the basis of brightness. I think that's what I've basically heard and could make of it.
I would love to see it in action with console games running at a locked 30 FPS. If it works then the decision to remove it is pure insanity from a picture quality perspective. I mean, what on Earth were they thinking? As far as plasma motion clarity is concerned, the best plasmas actually had a persistence of less than 2 ms. Anyway, I hope they bring the rolling scan back eventually.
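For what it's worth, the persistence figures quoted in this thread map onto "equivalent Hz" by a simple reciprocal. A minimal sketch using the thread's own numbers (3.2 ms for the CX/C1 rolling scan, under 2 ms for the best plasmas, ~1 ms for CRT phosphor):

```python
# equivalent_hz = 1000 / persistence_ms: the sample-and-hold refresh rate
# that would give the same motion blur as the quoted persistence.

for label, ms in [("LG CX/C1 rolling scan", 3.2),
                  ("best-case plasma", 2.0),
                  ("CRT phosphor", 1.0)]:
    print(f"{label}: {ms} ms -> ~{1000 / ms:.0f} Hz-equivalent motion clarity")
```

That reciprocal is where the "effective 300 Hz or more" figure for the CX/C1 comes from, and why ~1 ms of CRT persistence is commonly equated with 1000 Hz.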
Nothing beats a CRT! And I have an OLED C1, a CRT, and a plasma, so I can say that because I'm always comparing... especially camera movement, which is very noticeable on OLED/LCD but not at all on a CRT...
It's funny to talk about the motion quality of displays when game developers still fok up their camera movement into an unnatural, jerky, headache-inducing shock experience. I've yet to see a single FPS with camera movement that does justice to anything above 24fps.
Oh God do I just not enjoy 1st person. Immsim genre is fine. But otherwise 3rd person just feels better. Like, no one plays Warframe and thinks "wow this motion sucks it's so dizzying"
The PS3 on my widescreen 480p CRT at 30fps plays smoother than many PS5 games on my OLED. Not even kidding; just try Resistance, an amazingly smooth 30fps.
I forget the name of the tech... it was supposed to replace CRT with a pixel-grid structure like LCD, but each pixel was a very small cathode-ray-tube-like emitter... that would've been a good replacement... I think. Maybe, though, motion blur would still have been a problem with it as well.
@@ShankMods Makes me sad every time I think about SED & FED. Even Pioneer's Kuro tech got shut down; it seems like quality displays are only destined for the professional-grade markets.
I would love to see a new 1440p CRT at 200hz. I'd buy it
Won't happen, unfortunately. It's like going to the moon: we don't know how to make CRTs anymore. The assembly lines don't exist anymore.
@@WalnutOW Nah, we know how to make them. It's not as if over a century's worth of R&D just up and disappeared. The community behind CRTs would somehow have to convince a company (like Sony or Samsung) that it would be worth their time to re-invest in production.
One of my goals is to make a 4k 200hz CRT. It would require a ton of circuit design work, but I think it's possible. As far as I know, most CRT monitors could display higher resolutions, but the circuitry inside driving the electron gun wasn't designed for those inputs.
Same
The dream monitor.
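As an aside on the 4K 200Hz goal above: the hard part shows up in the scan-rate arithmetic. A rough sketch, with blanking overheads assumed (~10% vertical, ~5% horizontal) rather than taken from any real timing standard:

```python
# Rough feasibility numbers for a hypothetical 4K 200 Hz CRT. A CRT's
# deflection circuitry is limited by horizontal scan rate and video
# bandwidth; the blanking overheads below are assumptions.

h_active, v_active, refresh = 3840, 2160, 200

h_scan_khz = v_active * 1.10 * refresh / 1000          # lines/s incl. v-blank
pixel_clock_ghz = h_active * 1.05 * h_scan_khz / 1e6   # dots/s incl. h-blank

print(f"horizontal scan rate: ~{h_scan_khz:.0f} kHz")  # ~475 kHz
print(f"pixel clock: ~{pixel_clock_ghz:.1f} GHz")      # ~1.9 GHz
# For scale: the Sony GDM-FW900, one of the fastest consumer CRTs, topped
# out around 121 kHz horizontal, with a pixel clock of roughly 400 MHz.
```

That is roughly a 4x jump in deflection speed and a 5x jump in video bandwidth over the best tubes ever shipped, which is presumably the "ton of circuit design work" the commenter has in mind.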
I am crying right now. My dad used to have a huge-ass Sony CRT TV in the basement, which he threw out a week ago. Now that I've watched this video, I found out that it was the Sony KV-40XBR700, which is one of the Holy Grails of CRTs.
Damn, man. Sorry to hear that. One of the best for playing retro consoles.
I know what it feels like because my dad threw away a ViewSonic desktop CRT like 5 years ago. I feel sick every time I think about it.
RIP for the Trinitron🫡😔
40inch 300 pound Godzilla Trinitron RIP🫡🥲🫡
Don't be too upset. The KV-40XBR isn't that great. Anything that's not 1080i is scaled, which adds quite a bit of lag and doesn't look that great. Basically all 1080i content is 16:9, and the 40XBR is 4:3, so you're essentially guaranteed to have either lag or letterboxing. It is by no means a Holy Grail, and honestly it's pretty awful for gaming in general unless you are going deep down the scaler rabbit hole.
@@ShankMods yeah you gotta find out how to put it in 540p mode in the service menu.
The difficult part isn't believing in the superiority of CRT… the difficult part is going to a store and buying one. Not to mention finding a shop to service an old one. The last shop I knew that did it… has been closed for a couple of years.😢
Getting your hands on a working CRT is difficult, but the very next video I'm currently working on covers some of the most effective ways of getting a CRT. I didn't mention it in this video because it really was beyond the scope of what I wanted to focus on. But expect a video soon that covers this.
@@GTXDash I don't think it's hard to get a CRT; it's hard to find a high-quality CRT. Ideally you'd want something capable of pushing 1080p+, and as of right now you'll pay hundreds to get ahold of one.
@PJxBuchwild Yep. I don't have a CRT that can do 1080p. However, I wouldn't call them "low quality".
They are inferior, dude
Tell me about it! I've had a Street Fighter 2 cabinet with a vertically collapsed screen for years, and I just cannot find anyone to service it.
Decent summary, but a couple things:
1. CRTs do have a native pixel resolution, so to speak. The dot pitch of a CRT determines how sharply pixels map to the phosphors. You divide the height and width in millimeters by the diagonal dot pitch, then divide those numbers by 1.25 because, ideally, 1 pixel maps to 1.25 phosphors (worked numbers after this list). Past that, you get the natural anti-aliasing you described in the video.
2. CRTs do have input lag. A CRT at 60Hz will have 8.33ms of delay at the center of the screen and 16.67ms at the very bottom, due to how they draw scanlines. By comparison, some of the highest-rated flat panels on RTINGS measured at 9ms. That's a difference of less than 1ms, which is smaller than the difference between certain USB gamepads, keyboards, or mice.
3. LCD bilinear filtering can mostly be avoided by using integer scaling. 240p, 720p, and 1080p all cleanly scale up to 2160p. 480p unfortunately does not, but the black bars with letterboxed 1920p are no more distracting than watching 16:9 content on a 4:3 CRT.
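A worked version of points 1 and 2, with an assumed example tube (a 19" CRT with a ~365 x 274 mm viewable area and 0.25 mm diagonal dot pitch; real models vary):

```python
# Point 1: "native" resolution from dot pitch, using the 1:1.25
# pixel-to-phosphor rule quoted above. Tube dimensions are assumed.

viewable_w_mm, viewable_h_mm = 365, 274
dot_pitch_mm = 0.25          # diagonal dot pitch
phosphors_per_px = 1.25

native_w = viewable_w_mm / dot_pitch_mm / phosphors_per_px
native_h = viewable_h_mm / dot_pitch_mm / phosphors_per_px
print(f"'native' resolution: ~{native_w:.0f} x {native_h:.0f}")  # ~1168 x 877

# Point 2: scanout latency at 60 Hz, top-to-bottom beam travel.
frame_ms = 1000 / 60
print(f"center of screen: {frame_ms / 2:.2f} ms, bottom: {frame_ms:.2f} ms")
```

With these example numbers the tube lands near 1152x864, which matches the resolutions such monitors were typically marketed at.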
I'd love to see someone make a more modern CRT display with modern ports and better brightness. I'd love a display like that over my OLED.
Great video. To my understanding, the reason CRT computer monitors have more visible flicker at 60hz than their television counterparts is the choice of phosphors. Shorter-persistence phosphors don't leave trails as long, but result in more visible flicker.
Sub-pixel light scatter is something no other technology has been able to recreate... there is something so realistic and true about the light that emits from the phosphor grid... skin looks like skin in a way that has depth, unlike OLED or even LED. It's like comparing the light of an incandescent bulb with an LED bulb running at the same colour... one fills you with warmth (yes, physically too, but... not my point), and the other just feels cold, even though the colour of the light is warm. Incandescent bulbs emit a broader spectrum of light... making them also healthier for us, and... I wonder if light from a cathode ray tube has that too...
Check out plasma TVs, especially the 9.5g KURO Pioneers. They also use phosphors.
You're referring to the natural incandescent quality of CRTs, which is incredible, but you are mistaken that it's exclusive to CRTs. The last few generations of plasmas exhibited this quality quite strongly, not as much as CRTs, but the last two generations of Pioneer Kuro plasma displays in particular had beautiful incandescence.
Ngl it looks way less realistic compared to other monitors and can be added easily with a filter. Also it makes reading text extremely difficult
My dad used to repair CRT TVs and monitors, so I saw many different models and pictures. When LCDs started taking over, I never understood why people were switching from CRT to LCD. The picture quality was so much worse, but people just didn't care. Thanks to my dad, I have 4 CRT TVs saved.
I've been using CRT monitors since 2011, when I discovered hertz. I have played modern titles on the ultra preset on my GTX 980 Ti, and I play competitive games too.
The research this guy has done is 100% accurate. I'm well experienced now with new modern displays and CRT monitors, and I still use my Sony 20 inch CRT monitor over my 2K 160hz LG UltraGear.
I hate that they didn't keep even one factory running... WE GET FREAKING TN PANELS that are basically e-waste, but not CRTs, which are basically required hardware for retro gaming and TV.
@hunn20004
They still make tubes overseas.
There may not be CRT repair techs per se, but specialized video game and pinball repair shops will often re-tube them and do the alignment/calibration work. It won't be cheap, but the local pinball shop I work at has started doing them due to increased demand, as we are already used to working around high-voltage electronics.
I didn't stop using CRTs because of the size; I stopped because they were headache-inducing, super washed out, and hard to even see unless the lights were off. I had to spend 8 hours a day working on a CRT for 10 years, and the introduction of LCD monitors was the best thing ever for me. For gaming, the only thing I ever cared about was input lag, but that was only a problem like 10 years ago; it isn't a thing anymore.
Yeah. That's why we only use CRTs for specific things. For other stuff, especially productivity, you need a flat panel.
Everyone should really see and play with CRT to realize that CRT is a treasure. The reason why classics are classics is that they cannot be surpassed.
I think the answer is in gray-to-gray ghosting. LCDs ghost because of the time it takes to change a pixel from one color to another. Sometimes the time it takes a pixel to change into whatever the next frame requires is longer than the time it takes for yet another frame instruction to arrive.
This is true. But even fast flat panels that only take a few milliseconds for a frame to change still suffer immensely from motion blur.
@@GTXDash What I was thinking is that there might actually be a much larger delay per frame than the few milliseconds we measure. For one, a detector might register the time between one moment and when there has been enough change for it to detect a new frame, while the pixels haven't yet reached the desired color. This might not be as big a problem if the content being recreated is only 23.97 fps, since the time between each frame is long enough for the pixel to reach the correct color; but when trying to pump 300 fps of fast-moving pictures through an LCD panel, one could imagine the picture getting a little smeary. And ultimately, seeing this transition take place before your eyes, instead of a crisp image strobing, could also add to the blurry effect?
CRT pictures don't have the problem of physical filters that have to change alignment to block light; they only have the rise and decay of the phosphor glow when it's activated, which I think only translates to overall perceived image brightness?
There is also the whole thing of CRTs being additive light and LCDs being subtractive light, which adds to the magic experience of our beloved CRTs.
It's that gray area, my dad always said...
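The scenario described above, put into numbers; the GtG values are assumed examples rather than measurements of any specific panel:

```python
# When grey-to-grey response time exceeds the frame time, a pixel never
# settles before the next frame arrives, and every frame is a smear of
# in-between colors.

def settles(gtg_ms: float, fps: float) -> bool:
    """True if a transition can complete within one frame."""
    return gtg_ms <= 1000 / fps

for gtg in (5.0, 12.0):            # e.g. fast IPS vs. a slow VA transition
    for fps in (23.97, 60, 144, 300):
        frame_ms = 1000 / fps
        verdict = "settles" if settles(gtg, fps) else "NEVER settles -> smear"
        print(f"GtG {gtg:>4} ms @ {fps:>6} fps (frame {frame_ms:5.2f} ms): {verdict}")
```

At 23.97 fps even the slow transition finishes with time to spare, matching the comment's point that the problem only bites at high frame rates.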
I wonder how awesome CRT monitors would be today if they were still in development. It would be awesome if some brands took up manufacturing CRT monitors again and put time into making them thinner and lighter than they used to be, if that's even possible.
It's a nice thought, but the fundamental way in which a CRT works means there's a limit to how thin and light you can make one. Keep in mind that the picture is drawn by a particle beam fired from a single emitter at the back of the tube. It's basically the center point of a giant sphere, with the screen being the outside surface of said sphere. The thinner you make it (that is, the smaller the radius of the sphere), the smaller and more curved the screen has to be to get a legible picture. All this has to be done while maintaining a vacuum inside the tube, and I'm pretty sure the bulk of a CRT's weight back in the day already came from needing to use materials strong enough to do that.
Could we improve them with the technology we have now? Probably a little, but I imagine part of the reason we stopped developing CRTs is because we were already seeing diminishing returns.
At the very end, they finally found a way to emit particle beams without the need for a long tube, which was about to lead to actual flat panels like LCDs.
I remember when we all started to transition from CRTs to LCDs back in the early 2000s. Sure, LCDs were new and sleek and opened up your whole desk, but it was generally accepted that they looked way worse. The biggest issues were the worse black levels and the motion blur. There really wasn't much of a competition, though; the heft of a CRT was just untenable. We all just stopped recognizing that CRTs even existed.
What if you had a CRT with the electron gun where the stand for a LCD monitor would be, and reflected the image with a mirror?
CRTs are not ugly. They are very charming.
Awesome video! I only have a couple things to add:
Regarding your comment that CRTs have next to no latency: the latency to the center of the screen on a CRT is affected by the scanout behavior. Since a CRT scans the image line by line from top to bottom, the latency to the center of the screen is approximately half the frame time. At 60Hz, the frame time is 16.67 milliseconds (1000ms / 60Hz), so the latency to the center of the screen is around half of that, approximately 8.33 milliseconds. This accounts for the time it takes the electron beam to reach the center of the screen during the scanning process. This is not much faster than most gaming monitors at 60Hz if also measuring to the center of the screen, so it is not really an advantage of CRTs. And 360Hz+ gaming monitors will still have lower latency than basically all CRTs.
I still think that impulsed display technology (CRT or LCD/OLED BFI) is a bandage for truly 'retina' motion. I agree with your point that it is impossible to run most modern games at extremely fast refresh rates natively. But for motion to look realistic, the gold standard should be native 1000Hz+ without any strobing. I don't think this is unrealistic to achieve in the near future either. Lossless Scaling already supports 3x AI frame generation in any game to make this easier. Just because 60Hz has been around since Atari does not suggest that the gaming industry is not undergoing a paradigm shift towards higher refresh rates in recent years. We already have 480Hz OLEDs, which is already 50% of the way to retina motion clarity. I think that with the help of AI frame generation and other technologies, the future will be 1000Hz+ OLED/micro-LED displays without any BFI or flickering, and this is probably not as far away as many people think. BFI and CRTs are just bandage solutions to make lower framerates look good, and they inherently cannot fix temporal aliasing (wagon wheel effect). Only true high-Hz can fix that.
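A quick Python sketch of the scanout arithmetic in the first point above (the line count is an illustrative assumption; blanking intervals are ignored, which would add a few percent):

# Latency from the start of scanout until the beam reaches a given line,
# assuming a uniform top-to-bottom scan and ignoring blanking intervals.
def scanout_latency_ms(refresh_hz: float, line: int, total_lines: int) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * line / total_lines

print(scanout_latency_ms(60, 240, 480))  # center of a 480-line frame: ~8.33 ms
print(scanout_latency_ms(60, 480, 480))  # bottom of the frame: ~16.67 ms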
This guy knows tech. holy smokes lol
@@ethanwright752 unlike the guy making the video who thinks that people buy 360+Hz monitors to play triple-A games at high res 😂😂😂
Lossless Scaling is expensive and degrades image quality, if not stuttering all the time. I'd rather use 120Hz backlight strobing.
@@normaalewoon6740 you can play a game with native DLSS support, which will have much less image quality loss
The guy literally said “a few milliseconds of lag is basically negligible” when most modern flat screens can do around 0.1 milliseconds of GtG response time
The amount of hate in the comments is a bit sad. CRTs totally rock!
120Hz BFI on an OLED was the closest we got to good motion resolution. Guess what, they dumped it...
Wait... what? Who did? The manufacturers?
@@GTXDash LG C1/G1 are the latest OLEDs that support 120Hz BFI; even then, they have (slightly) lower motion resolution than the CX and GX models. Following models (C2, G2, C3, G3, etc.) have BFI, but it does not work in 120Hz mode.
No other major OLED tv manufacturers have good BFI modes. Aside from some oled BVMs, that's as good as it gets.
@@Heymisterbadguy The retrotink 4K has full-fat hardware accelerated 120Hz & 240Hz BFI with HDR injection, so there is no brightness/luminance loss when using BFI, it is the best BFI algorithm to date, except for Sony's RBFI in their BVM OLED monitors, as you already stated, which is able to get smooth motion @ just 60Hz.
@@Wobble2007 well, its BFI is really, really nice, but I don't know if it can reach the motion resolution of the C1/G1, which is not just double motion resolution (like the 4K would be, since it caps at 240Hz)
Also, the RetroTink at 240Hz caps at 1080p
@@Wobble2007 When I mentioned 120Hz BFI, I meant BFI for 120Hz sources.
I remember when LCD monitors started to become widespread in offices in the early 2000s. Boy were they shitty. Grainy, blurry and unresponsive. They only made the beancounters happy, due to less energy consumption (also for the air con) and more real estate on the desk.
Well they were much better for the health of the workers
One thing this video doesn't mention about the disadvantages of CRTs is their power consumption and the influence of the environment on image quality. I remember from my childhood that CRTs, in comparison with even early LCDs, looked like washed-out crap with non-existent black levels and pain-inducing flickering. The reason is that during daylight a CRT was not able to produce a bright enough image to win against the power of the sun, so the whole image became a grey mess. It wasn't an issue during the nighttime, but let's be honest, as a kid you could not use a PC that late, or even be awake, and most people work during daylight hours. So the issue of crappy image quality was prevalent in exactly the circumstances when CRTs were meant to be used.
Phosphors. After glow. CRTs use very different technology. Games look spectacular on CRT
Plasmas used phosphors too, and after CRT, were the best for motion clarity and natural colors.
12:42
More vibrant colors and better black levels are contingent on having absolutely ZERO glare in your room with a CRT. Any ambient light and an LCD will win. This is coming from a guy who uses both currently.
LCDs struggle with shadow detail and warm colors.
CRT is the same as a QD OLED, only dimmer.
It's not just glare. Look at a CRT and a flatscreen with the lights on and the screens off. The flatscreen is jet black while the CRT is dark gray at best. Those are the darkest their blacks can be under those lighting conditions. And not all of us like having to watch or play in complete darkness.
@@stevethepocket A CRT is not dark grey, it simply isn't that bright.
LCDs lack any dynamic range whatsoever.
Yeah I hate playing in complete darkness... fairly certain that's bad for your eyes too. Also, @stevethepocket, on a totally black image you're correct. However, the ANSI contrast (a more accurate measurement) is roughly the same between an LCD and CRT.
@@GAMERSLAYER-o4j LCDs have a terrible color gamut. They lack that CRT or QD-OLED depth. I don't believe those measurements are correct; there's a night-and-day difference between the screen technologies simply by using your eyes.
The bad part is that people like us who like zero lag and blur are labeled as picky and fussy. The very first time I used an LED TV I felt the horrible lag.
Unfortunately, in the past my family forced me to get rid of my CRT monitor because of space. It was an LG Flatron EZ 17" (T730SH). I feel sad about it, but thinking it over and over will only make me feel worse. I will eventually buy another one.
I hope the industry will still manufacture some CRT monitors and TVs as retro slowly grows popular; the problem is that what dictates the trends is the masses, and most don't care about purity of gaming experience. Hope never dies, I keep positive.
I wish I lived in an alternate reality where SED/FED tech took over LCD in 2005 and then the tiny electron emitters mimicked CRT strobing to lower motion blur.
The best thing about crts is the thing you can't explain to someone who hasn't experienced it: the immersion in content. It's almost like vr when compared to lcd
It's the brain being saturated with information one pixel at a time, by one electron beam drawing on a screen inside a vacuum tube.
And you definitely cannot explain it to people who weren't visually trained to recognize frames at 1/20th of a second.
At 0.005 seconds was where I began to notice anti-gravity effects on a broad scale... my goodness - outerspace not having gravity means that light goes both directions so fast that it alters gravity itself... that is what the physics engine realspace2 produced into the world.... the theory became a reality 100%.
Edited the dat file of my monitor to run at 480p 240Hz and the smoothness is insane. I wish CRT tech had kept being improved.
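For a sense of what a mode like 480p at 240Hz asks of a CRT's electronics, here is a rough pixel-clock estimate in Python (the blanking overheads are typical ballpark assumptions, not this monitor's actual timings):

# Rough pixel clock for a custom CRT mode, assuming typical blanking overheads.
def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    h_overhead: float = 1.3, v_overhead: float = 1.05) -> float:
    h_total = width * h_overhead    # active pixels plus horizontal blanking
    v_total = height * v_overhead   # active lines plus vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(f"{pixel_clock_mhz(640, 480, 240):.0f} MHz")  # ~100 MHz for 640x480 @ 240 Hz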
The minimum acceptable refresh rate on a 17" CRT was 70Hz. Anything less had obvious flicker. LCDs don't flicker, because they don't rely on phosphor persistence. Then there is the much lower resolution and smaller screen size of any kind of affordable CRT, combined with the huge amount of space they consume. The blurriness of a CRT is of course an advantage when playing ancient consoles, but you can just use an emulator instead with an LCD.
The biggest drawback of CRT monitors is IMHO eye strain. I still remember having headaches and sore eyes after spending a late night in front of the PC during CRT days. Having to deal with image geometry and soft corners was also a bit of a pain with CRTs.
did you have your crt at 85Hz? I get the same issue when I run my CRT at 60Hz, but I don't get it while at 85Hz+
@@jskilabe5986 I don't remember for sure, but I think it was around 75 Hz. With 60 Hz the flickering was unbearable even for shorter periods.
Man, I love watching stuff on the old tube. They should really start making CRTs again.
They should. Update the technology. Samsung and Sony back in the 2000s were working on new CRT tech that made the tube just a few inches thick. Would've been awesome.
CRT monitors are not bad and they are not ugly. They have personality and they are robust. People say they are fragile but they are not, you just have to be careful not to drop them like you have to be careful not to drop your flat monitors. Honestly I am wondering if I should invest in a nice CRT Monitor because I rarely play any FPS games.
Just get any old 1280x1024 @85Hz 17- or 19-inch of any brand; they should be cheap. I have a 1280x1024 Dell M990 from 2000 and it's great.
CRTs at these resolutions are amazing and give you the flexibility of a higher-def top limit at a high refresh rate, while also giving insane refresh rates at lower resolutions. Plus, anything bigger than 19 inches imo is too big and heavy unless you have the space.
@@tommynobaka and a GTX 1080
Good job dude this is actually great. Definitely will be checking in from time to time
I used to use a CRT as a 2nd monitor on my main PC; then it died, and I did not feel like frying my insides just to keep using it.
@@teh_supar_hackr fry your insides?
@@GTXDash Capacitors hold charge. If it's not discharged and you open it, it'll discharge into you.
Same for microwaves. Be careful trying to fix them
@@ChArLie360115 Oh yeah. The CRT's anode terrifies me.
@@GTXDash Repairing a CRT to me is like a death wish I don't feel like risking.
Same here, mine is a CTX EX701F which just a few days ago stopped going into the menus, reset all the picture settings to defaults, and shows up on my PC as a "LXH-GJ769IIF" which makes me think some sort of firmware corruption. I don't want to get rid of the CRT but I might have to replace it.
I hope to fix my Sony FW900 one day, close to impossible, but one can always hope.
They were cool but my eyes never want to go back to flickering. I am perfectly happy with my 165hz IPS monitor
Finally I understood the benefit of black frame insertion, subscribed.
Such a good video my guy, ty for explaining this
Honestly the video capture of the CRT screen makes it look like complete garbage. It doesn't really deliver the point being made.
Even OLED can't really be appreciated until it's seen in real life.
DLSS and FSR do wonders with the frame rate in GPU-limited scenarios. But yeah, I wish I could get a good 1080i CRT, but those are so expensive… I wish someone would start producing CRTs again. And I mean, they are freaking particle accelerators, which is awesome.
I always ask myself why I always enjoyed to watch movies on CRT TVs or monitors.
Something was right.
you were younger back then.
Life felt more wonderful in general.
Now you’re old, experienced and bored.
@@maalikserebryakov interesting perspective.
I'm just 38.
12:04 Don't forget CRT color convergence never being perfect. And a big benefit of LCDs is the backlight can have no flicker (good for static images, not great for motion).
That too, especially if you don't play games.
I made a Pong… on LCD I got a square for a ball; on CRT it was round. Love CRTs' blurriness for games. LCD is good for text and fine-line tech drawings… CRT for visuals. Beautiful colors and natural looks. This is why I still sport my CRT and will never get rid of it.
CRT for visuals
motion blur is just "reverb" for video games. I love it!
I also like blur but only the intended rendered motion blur that gives the cinematic look.
@@GTXDash Hello
@@GTXDash You didn't like my comment? I'm not referring to the comment I'm making now, or the other one I made that says Hello; maybe you didn't see it. I made a comment to you and I hope you see it. It's not hate.
I love your videos, keep in mind that I will keep up with your videos but I won't always have time.
@@AlexBernard777 Yeah, sometimes I accidentally miss comments or RUclips doesn't show them right when they're posted. But thanks for the comment.
If you ever want to use these, I suggest getting sunglasses and eye drops to use every 5 minutes, because I remember these were painful to look at for more than an hour.
It's a shame CRT monitors are no longer being produced or improved on
16:18 My Syncmaster 955DF is a 1856x1392 VGA CRT monitor. I counted twice the number of phosphor dots for text vs pixels on my 1080p LCD; it's basically a true 4:3 1440p monitor.
Not only is there no aliasing on my monitor, but text is way, WAY sharper than on my 1080p LCD.
The solution should have been SED/FED tech at 60-75Hz, with its individual electron emitters doing the flicker naturally......... I'm really bummed they never made it to production 😢☹️
Sad, ikr..
Thankfully CRTs have been outdone now, with 480Hz OLED beating them for picture quality while having incredible motion clarity with black frame insertion or, even better, native 480Hz. The essentially 0ms MPRT makes all the difference at such high framerates, whereas an LCD's MPRT often makes the benefit beyond 240Hz almost none.
@@mikafoxx2717 Black frame insertion has dark-image problems, and it's not the same as CRT line strobing. Also, native 480Hz clarity requires motion to be 480FPS. A CRT only required 60FPS@60Hz or 75FPS@75Hz, etc.
You don't get motion clarity when your game is running at 60FPS on a 480Hz OLED or whatever, sadly... since the issue then is that no real games run at those frame rates unless you are running 90s-2000s games or a competitive FPS at super low quality settings.
@electricblue8196 It is the same as strobing when you actually make it strobe. There are CRT emulation shaders too, doing that. And with 1000-nit OLED panels, you can flash them bright quickly if you trick it into HDR mode or some such with software, for BFI or the CRT scanline shader that Blur Busters just came out with.
@@mikafoxx2717 It's interesting with the phosphor-persistence emulation and strobing thing, instead of 100% duty cycle (ON/OFF) strobing, which looks annoying and worse in software (UFO Test BFI)... but I have a feeling this sort of thing must be done in the monitor's hardware and not software.
So to sum it up, you can push as many new frames to the display as you want but if it's not clearing/blanking the old frames first you've got motion blur.
Exactly. Especially if the blanking is longer than the time a frame is visible. BFI is very limited in this regard. This is why backlight strobing is what display manufacturers should continue to improve on.
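The underlying persistence math is simple; a minimal Python sketch using the common rule of thumb that perceived smear equals eye-tracking speed times how long each frame stays lit (the tracking speed is an illustrative assumption):

# Perceived motion blur ≈ tracking speed × image persistence.
# Shorter persistence (strobing/blanking) means less smear at any refresh rate.
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1000.0               # assumed: eye tracking an object moving at 1000 px/s
print(blur_px(speed, 16.7))  # 60 Hz sample-and-hold: ~16.7 px of smear
print(blur_px(speed, 2.0))   # 2 ms strobed backlight: ~2 px
print(blur_px(speed, 1.0))   # ~1 ms CRT phosphor: ~1 px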
Your point around the 7-minute mark is exactly why I push for black frame insertion in modern displays, because these MFs don't realize increasing refresh rate doesn't matter if you can't drive it. Currently on the oled_gaming subreddit you can find people literally buying $800-1400 OLED displays when their computer can't even drive them properly, and it's just insane. I could drive a 360Hz OLED playing Counter-Strike 2 or Hunt: Showdown or World of Warcraft, but most gamers cannot. What would be superior is an LCD/OLED that mimics a CRT in terms of pixels on vs pixels off, giving our eyes that same illusion that grants such perfect motion clarity. If they developed an OLED that was, say, 240Hz and included a true black frame insertion mode that kept the full refresh rate, it would be pretty sweet. As it stands, modern BFI will generally double your visual performance: 120Hz + BFI looks exactly the same as native 240Hz, and 240Hz + BFI would look like native 480Hz... and obviously, with less fps, BFI helps improve things. Sure, new 4K monitors are coming that are 240Hz and have a 120Hz BFI mode (it runs in 240Hz mode, but only every other frame is displayed, so frame, black frame, frame, black frame, and so on). For console, that 120Hz + BFI mode might be awesome, but most people already have 120Hz OLED gaming TVs with BFI... not to mention television BFI is superior, because both LG and Sony (the top brands for OLED TVs) use a "rolling scan" black frame insertion, and it works. Hell, my Sony Xperia 1 Mark II uses a rolling-scan BFI and it's only 60Hz but feels like 120Hz; it's amazing. Watching movies/anime on my phone in bed is way nicer than on my PC gaming monitor, simply because my AW3423DW 3440x1440 175Hz OLED doesn't have BFI, so you get extreme judder in panning shots, and it's clear as day in anime... yet on my phone, which is OLED + BFI, you can't even tell it exists... I would kill for a proper 240Hz + BFI OLED gaming monitor. Sadly, we aren't getting it yet.
I had that shitty Blur Busters-certified ViewSonic IPS 1080p display which had BFI at 60, 120, and 240Hz... 60Hz BFI made my eyes hurt; it was atrocious. 120Hz + BFI wasn't bad and looked like native 240Hz... and then 240Hz vs 240Hz + BFI showed no difference, because the pixel response wasn't good enough to show one, so technically it's capped to 240Hz with or without BFI. However, OLED actually has TRUE sub-1ms pixel response, so BFI at any level would be amazing. I really want 240Hz + BFI... true 240Hz BFI.
A solution for getting CRTs' natural anti-aliasing on LCD/OLED? A honeycomb subpixel structure. Instead of red/green/blue subpixels, we need a technology where one subpixel can produce the entire range of color, arranged in a honeycomb/hexagonal layout. And since the subpixels would be so small, you end up with a sharper display, as the PPI would be much, much higher.
I think one of the reasons CRTs got replaced very quickly in offices is that LCDs have a clearer picture and are less blurry, which helps with text.
Yeah, when you go back to a CRT after playing on a flat panel, plugging something like a Mega Drive into a CRT and pressing the jump button feels like the character jumps slightly before you press it. Not a joke, it genuinely feels like that. Obviously impossible, but it's more of a feeling thing; it must be something to do with your brain being adjusted to lag.
Actually, power consumption differences are negligible in favor of flat-panel displays at best. That's only if you're comparing square inches of display per watt, and using only their peak wattage draw, for like the extent of 2-4 hours. The problem with the comparison is that CRTs are only at their peak power draw in the first few seconds of startup, and their continuous wattage draw is a significantly small fraction of that, whereas flat panels generally just reach their max power draw and stay there roughly continuously. Monochrome CRTs generally use roughly a third of what color ones do, even. Basically, leave them plugged in side by side on a power consumption monitor, and eventually the CRT will probably end up consuming less power in continuous long-haul usage.
There's truth to what you say. I have a backroom LG 1440p 144Hz monitor and I will put the wattmeter on it tomorrow to see its average. But I know for sure OLED will eat more power than a high-refresh CRT. My buddy's 27-inch 1440p 240Hz OLED was using 55 watts idle at the desktop (dark background), around 86 watts with frequent spikes to 100 watts while gaming at his settings (110 watts during bright scenes), and 120 watts on white-background webpages. Idk how the mfgs are cooking the numbers in the manual, but that's fairly high.
Whereas on my Sony G520 in the living room(21 inch CRT) 1440p at 85hz(128.5KHz tube speed) will spike to 133watt during degauss cold startup and then 87watt average gaming and 96watts in bright scenes, with the occasional 101watt spike with a almost full white background website page.
Next to it(dual CRT setup for convenience lol) the Fujitsu/Siemens 21inch(shadow-mask) at 1200p at 90hz(112KHz tube speed) spikes to 126watts degauss cold start, then averages between 80-91watts depending on the scene gaming, and 104watts mostly white screen internet page.
Before I got the Siemens, was given a Sony HMD-A100(15 inch) by the same OLED buddy. It looks spectacular after a 10 minute warmup at 768p 85hz. It used a very low amount of power and surprised me. I put it in the shop as the music/info terminal.
Lastly my Samsung 997DF(19 inch) 1200p at 72hz is oddly close to the Sony 520 in power consumption but looks great. I put it in a vented box with a big moisture eater bag from work in there. I check the bag and bake it every 7 months to keep the Samsung in good shape for emergency backup. Sadly I dont have data on the last 2 CRT monitors saved on my computer, but I did test them when the wattmeter arrived.
Sorry for the long text but its rare to see someone who has modern experience with older display tech. I daily the CRT monitors(typing this comment out on the big Sony) because its vastly superior to my not so old 32inch 1440p LCD in every aspect besides screen size. Thats why I pick up 17inch+ roadside/dump tube monitors every chance available and test them, LCD seems terrible in comparison and I like having backups. Would buy an OLED but its just too expensive.
@@insurgentlowcash7564 Naw, no worries. That was insightful and kind of illustrates basically what I was getting at, with data, however anecdotal it is. Reading is no difficulty to me. I read and write effortlessly, by fortune of being effectively literate. 😄
@@insurgentlowcash7564 You can run a 30" 4:3 CRT on 100w. With a much bigger picture.
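Turning wattmeter readings like the ones above into running costs is straightforward; a small Python sketch (the daily hours and electricity price are assumptions for illustration):

# Annual energy cost from an average power draw.
def annual_cost_usd(avg_watts: float, hours_per_day: float = 4.0,
                    usd_per_kwh: float = 0.30) -> float:
    kwh_per_year = avg_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * usd_per_kwh

print(f"CRT at ~90 W average:   ${annual_cost_usd(90):.2f}/year")
print(f"OLED at ~100 W average: ${annual_cost_usd(100):.2f}/year")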
60hz flicker looks awful on a CRT, it was the bane of many a worker's existence if they were staring at it all day for work. 60hz on an LCD doesn't flicker the same way at all and modern zero flicker backlights are much nicer as well.
Pioneer and Panasonic plasmas are very close to CRT when it comes to motion clarity. I owned a Panny plasma for 12 years and gaming on my consoles was always a smooth experience. Keep in mind that the vast majority of these games were running at 30 FPS. Eventually my plasma burned out and I replaced it with an LG C2 OLED and I couldn't believe how bad the motion was. 30 FPS was simply unplayable because of the stutter, and even 60 FPS looked very blurry. And it wasn't just the motion clarity. Somehow the colors looked more natural on the plasma, and that was "only" in SDR. And then there's banding and uniformity. Both are still superior on a plasma. Anyway, I sold my C2 and went back to plasma. I bought an excellent refurbished Kuro and it's literally the best picture I've ever seen. I don't care if it's "only" SDR and 1080p. The motion clarity and natural colors are simply stunning.
Someone where I live is trying to sell a 720p Pioneer Kuro... it's tempting, but I'm not sure it can fit in my room.
I'd mainly use it just to game on 360 and PS3 and watch movies via Blu-ray.
My Panasonic CRT is decent, but 480i, not 480p like the later HD models.
It has component and overall I love it. It was free from a church, and given I run low contrast and just a tiny bit higher brightness than factory, I'd say it's very low hours.
@@roveradventures How big is it?
@@Stoddardian pdp-1130HD. Should be a 50". Has the box, and display. Speakers I can get on ebay for 20 bucks. I'm mostly concerned about hours on it. Ironically I have a cheaper pioneer set but the power board is nla and it'd cost more to get the board than a whole tv...lol.
@@roveradventures If it's cheap I would go for it. I got really lucky with mine. It was refurbished by a guy who worked as a Pioneer technician for 15 years. It still had over 90% of its original brightness left. It's an LX6090.
@@Stoddardian Indeed, it was $120. But if I go for it I'll try to drop the price, since the speakers are about $20 on eBay.
Ideally they didn't use max contrast or brightness. In the Panasonic factory settings, or at least the settings I saw, "Picture" is usually maxed out, which can really decrease CRT life.
I modified an LCD to strobe the backlight down to 1-2 ms (adjustable) and it made it so much more like a CRT with very clear moving objects, though it was a lot dimmer.
That's probably the only way we can achieve CRT motion clarity, if manufacturers try focusing on that rather than building something that goes higher than 500Hz.
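The dimming that comes with a strobed backlight falls directly out of the duty cycle; a minimal Python sketch (the 300-nit panel brightness is an assumed figure):

# Strobed-backlight brightness scales with duty cycle:
# time lit per frame divided by the total frame time.
def strobed_nits(full_nits: float, pulse_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return full_nits * (pulse_ms / frame_time_ms)

print(strobed_nits(300, 2.0, 60))  # 300-nit panel, 2 ms pulse at 60 Hz -> 36 nits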
The main 'latency' of a CRT is the time the phosphor needs to illuminate, so there is nothing that has no latency, but it's pretty much instant compared to any kind of digital image transfer.
So what analog signal are you using with your CRT? 🤔👍
@@MikJames-d1g Well VGA on PC Monitors (obviously) and RGB SCART on CRT TVs if possible.
My 20 year old 17 inch Dell CRT looks smoother than a 180hz freesync LCD that I just bought and returned. It's really frustrating.
Everything you said is indeed fact APART FROM..... when you said CRTs are ugly. They are beautiful goddesses.
In the Philippines there are still CRT shops in the rural areas; they repair and sell used CRT TVs.
The end-all video on CRT.
Extremely underrated channel.
Great video! I just honestly wouldn't downplay how great the BFI modes are in a select few recent OLED TVs, especially the LG CX/C1, and a few PC monitors tuned by Blur Busters. They can absolutely reach or surpass a CRT's motion clarity. An LG C1 at 120Hz with BFI maxed is incredible for video games! It took basically two decades, but we're getting there, finally.
I see I'm an inferior human, for I was never able to see the difference between 30 and 120 fps.
If you saw 120 fps on a 60Hz display, then you didn't see 120 fps.
CRTs were improved upon as FED and SED, but those never came to fruition.
Came for the CRT commentary, stayed for being a fellow ZA boy.
LCD/LED has always been inferior but could you imagine how deep a 100” crt would need to be.
I still have a CRT monitor for retro gaming, but you need a deep desk, modern monitors take up so little room.
If manufacturers had continued building CRTs, they would've improved the technology the same way LCDs continued to improve. Samsung (before canceling it) got a 30" CRT to be only a few inches thick. That being said, I think the industry needs to keep working on OLED technology to reach the reduction of motion blur that CRTs were able to achieve.
@@GTXDash i agree that OLED is the kind of tech we should be supporting.
@@GTXDash Can't wait for multi-stack RGB-OLED with native HDR-RBFI, which will tide us over until eQD which will hopefully have a native rolling-scan modulation method.
Excellent points all around, MXDash2! Thoroughly enjoyed how you put all this together, and you're spot on with every point! I spent a pretty high fortune on my whole PC setup, and especially when it comes to running my older games I miss my 1600x1200 CRT. OLED at the moment gives me most everything I want and it's impossible for me to want to go back, but I haven't forgotten the perks of the old CRT. Back in the 90s I actually ran my TV and VCR through my GPU because everything looked so much better on the monitor. Somewhere around 2005, I think, was when I stopped using my Radeon AIW 9800XT (if I remember correctly) where I had all that set up, and everything just felt great. These days it's amazing when it all works, but I'm plagued with games that don't work with Windows, or multi-core CPUs are the issue, or graphics drivers are obsolete.... A decent number just flat out don't work, and even some more recent titles don't.
Believe me when I say this: I love CRTs. I have a 29-inch Trinitron, RGB-modded, hooked to an RGB-Pi arcade setup (OS4). CRTs are great for retro gaming.
That said, CRTs, generally speaking, are NOT better. There are just more cons than pros at this point. I just pray for my Trini to last many more years. Even after doing a recap, I know it's not forever (it's already 15 years old).
11:11
I own both a CRT display at 60Hz (it's a Sanyo VM4209) and a gaming monitor (a BenQ EX2780Q at 144Hz), and I can affirm your claim.
They both feel about as smooth to use despite being made about 45 years apart.
Great video. I have a Mitsubishi 2060u, 22-inch, 120kHz. Running it at 800x600 at 160Hz, it has the smoothest image I've ever seen in person.
These are arguably the best tubes Mitsubishi ever produced, better in fidelity and IQ than the slightly newer 2070SB tube, much like Sony's GDM-5002PT9 being better than the GDM-F520 despite being older; it's the phosphor quality and electronics being slightly better in the slightly older models for some reason.
I love CRT TVs. I still have my "40 inch" Toshiba CRT and play my PS4 on it. The image clarity, sound (bass/treble/theater-quality surround sound), color depth, contrast, etc. are amazing to me. One of the many things I love about my CRT (and it's a major plus) is that the image & sound quality have not decreased at all over the years, nor has the image started to look washed out/overlit. Every HD TV I've had (top brands too) has always lost image & sound quality as it aged, always ending up looking washed out. My CRT Toshiba still looks & sounds just as great as the day I bought it. Great video!
Sound from a CRT TV is always perfect
Bro... not saying I don't like CRTs, but I'm not sure I can take a person seriously who talks about sound from TV speakers... why not just get an actual amplifier and speakers and use those instead?
Honestly I would pay real money for a nice CRT monitor. I rarely play super competitive games.
If you are okay with CRTs or backlight strobing on modern displays, I honestly envy you. Here is my story.
I used CRTs in school and university and I got headaches after some time. Bigger and flat ones were better for me, but still painful after an hour. Some of them ran at 100Hz at 768p; that also didn't help much. It didn't happen with TVs, but I viewed them from quite far away. Never had that on TN, IPS, or OLED. A friend's VA was also okay, but I never had one personally.
On the other hand, my cheap 144Hz TN has noticeable motion blur. If I use backlight strobing it does reduce the blur significantly, almost to none, but my eyes hurt after a few minutes. I guess I'm sensitive to that thing.
yeah I never had headaches looking at CRT's growing up in the 90's, but I got headaches a lot while staring at LCD monitors once those became a thing.
I have a CRT monitor I got free from someone and I love it. It has downsides, like, yeah, size, and danger if dropped or if I ever need to repair it, but it works better for me. I game at 1024x768 at 118Hz (the best I can get out of it), and it actually works *very* well; I often forget that it's not 1080p, as long as I have anti-aliasing enabled at least. I did have to scrub the damaged anti-glare coating off it because otherwise it was awful, so I can't really use it without having the curtains closed in my room.
I will not be getting any new LCD or any flat panel monitor until I can get something that is as good as a CRT in picture quality and feel. And I do wish I could exchange resolution for refresh rate on modern monitors.
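The resolution-for-refresh trade wished for above is exactly how CRTs already work: the fixed budget is the tube's horizontal scan rate, so fewer lines means more refreshes per second. A small Python sketch (the 5% vertical blanking overhead is an assumption; real timings vary):

# Max vertical refresh a CRT can manage at a given line count,
# limited by its horizontal scan rate.
def max_refresh_hz(h_scan_khz: float, active_lines: int,
                   v_overhead: float = 1.05) -> float:
    total_lines = active_lines * v_overhead  # active lines plus vertical blanking
    return h_scan_khz * 1000.0 / total_lines

print(max_refresh_hz(120, 768))  # ~149 Hz at 768 lines on a 120 kHz tube
print(max_refresh_hz(120, 600))  # ~190 Hz at 600 lines on the same tube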
I used a 17" CRT monitor next to a 22" LCD a few months back. The CRT would often develop a painful, high-pitched squeal. The CRT was dimmer and took a few seconds to reach peak brightness. The CRT's text was blurry. The CRT's image needed tweaking of a bunch of settings to be mostly square and centered, and changing the resolution meant all those settings went wonky. I didn't really care about the weight and bulk when I later replaced the CRT with another LED monitor.
That just sounds like a crappy CRT.
Not to mention how much cheaper a much bigger/higher resolution LCD would be...
@@GTXDash That just sounds like moving the goalposts.
Finally a clear and easy video for learning
I think CRT monitors are beautiful, unironically.
I wish I could get my hands on a decent quality CRT that wasn't over $100
@@bylectricagarmoniya they aren't
12:04 I didn't stop using CRT because of what you said. I had to buy LCD because there was no CRT to buy.
I wish I still had one. I used to have a 21" Mitsubishi. I miss it. I would even take a 21" ViewSonic that's how much I want one.
I"m still using my CRT for playing games, sometimes i use the LED ones for some games and aplications such as Adobe Animate, Blender, Premiere but my main playing monitor is the CRT one, Motion Clarity even at 60 hertz is another level, there is no modern scream that come even close to what CRTs can make, there's no input lag, colors are great, natural bloom, black levels.. CRT is the king.
The biggest issue with CRTs, and no one seems to talk about it, is that they are old and will require servicing. A lot of the parts, like flyback transformers, are unobtainium. High-end CRTs in good working condition command high prices and haven't been serviced, so you are taking a bit of a gamble. I own many CRTs and I have yet to open up a single one without it needing significant work or servicing. Out-of-spec, leaking capacitors are almost guaranteed at this point.
That's why we need to enjoy them while we still can. You don't need to buy an expensive unit for it to be working perfectly fine.
I have 3 CRT monitors that I found on the side of the road for trash pickup and they all work perfectly with no issues. I found a Samsung SyncMaster last week and the picture is insane. So freaking amazing.
There is nothing about a late-enough-made model that you cannot repair with easily found parts right now. You might need to consult a repair shop about how this replaces that, but it's doable.
@@xBINARYGODx You can't get a lot of the discontinued parts; they have no new tubes and flyback transformers on the shelf....
Black frame insertion done well on modern TVs would definitely make CRTs obsolete.
I hope so. If it's a high-refresh-rate display where 5 out of 6 frames are black, the one lit frame has to be driven bright enough to compensate for the darkening caused by those 5 black frames. Sounds outrageous, but that's really what it's gonna take to reach the level of no motion blur that CRTs are known for.
@@GTXDash yes
Standard BFI, that is, straight impulse modulation, cannot produce the same type of motion IQ and performance that raster-scan CRTs can. So for BFI to make CRTs obsolete, they will need to come up with an effective "race the beam" algorithm to simulate raster scan; this will also need to simulate the incredibly fast phosphor decay times, the phosphor glow, and all that good stuff CRTs have. The hardest thing to match will be the CRT's incredible native image depth, which with a good-quality input source can look almost three-dimensional; looking-glass, aka light-field, tech is a good candidate.
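On the one-lit-frame-in-six idea a few comments up: to hold the same average brightness, the lit frame has to be driven at the inverse of the duty cycle. A quick Python sketch (the 200-nit target is an illustrative assumption):

# To keep average brightness constant under black frame insertion,
# lit frames must be driven brighter by the inverse of the duty cycle.
def required_nits(target_avg_nits: float, lit_frames: int, total_frames: int) -> float:
    duty_cycle = lit_frames / total_frames
    return target_avg_nits / duty_cycle

print(required_nits(200, 1, 6))  # 1 lit frame in 6 -> 1200 nits for a 200-nit average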
If only modern gpus had vga output
There is always the display port connection to VGA.
VGA to hdmi works just fine
@@hehashivemind6111 yeah but you introduce some lag
Display port to VGA works, the issue is GPU drivers for custom resolutions.
@@ksysinf It's generally such a low amount on most of them that it realistically shouldn't matter.
I hated 60Hz on an LCD so much I would play Splatoon 3 on my CRT; I was able to get it to do 720p, but it died shortly after :(. Was cool af watching it die though: it started getting all wavy, turned red, and slowly faded toward the center till there was nothing.
My friend had a giant ViewSonic CRT and its still one of the best pictures Ive ever seen.
The best thing about CRTs over the other types: no latency. Zero.
I might have missed it, but another advantage CRTs have over new monitors is that, because of the lower resolutions you can choose, games benefit in performance at that resolution while at the same time looking better at that resolution on a CRT than on a newer monitor, because of how CRTs work. The only downside is maybe HUD elements that are programmed with higher res in mind being out of proportion.
OLEDs are getting close enough that it's hard to even see motion blur/ghosting. "1ms" high-refresh-rate TN panels suck balls by comparison, and they used to be king for ghostless LCD gaming, but they're nothing compared to the current top gaming OLEDs. It just took nearly 30 years to get LCD tech good enough to equal a CRT. I personally don't care about going back, and I am one that lived through the CRT era. My last CRT was an NEC that could do insane refresh rates up to 1024x768, and could do the max VGA spec at 2048x1536 @ 60Hz, not that that mattered, since many video cards didn't have enough VRAM, or their RAMDACs at the time struggled to go that high without looking like crap. I have zero desire to go back to that era; I have no nostalgia for scanlines. Old CRTs also blur as they get older from constant use, assuming you don't have other issues like purple-tinted screens as the caps fail.
Don’t forget phosphor dimming and burn in too.
Regarding the blurring issue: you can actually tune them to be perfectly crisp again. They have adjustment screws/knobs, accessible when you remove the case, which allow physical tuning well beyond what's achievable through the menus. I picked up multiple CRTs that were written off as "blurry", tuned them up, and used/sold them. I've never even encountered one that was blurry beyond saving, tbh.
The thing is, technology wise, CRTs didn't reach their peak when they were replaced. We'll never know how good CRTs could have been as a technology.
@@Skrenja CRTs literally had between 70 and 90 years of development and they are still only marginally better than the best OLEDs we currently have. There is simply only so much you can do with an electron gun and some phosphor.
Thanks for educating me! You definitely know your stuff!
You forgot to mention something you probably know very well: the fact that your eyes start to hurt after 1-2 hours on a CRT. This reason alone is enough for me to never even attempt to game on such an eye-burning mess again.
And don't forget those sweet ionising radiations 😊
@@Kourku Continuous exposure to CRT monitors exposes you to between 0.05 and 0.60 mSv/y, so if you had a CRT monitor constantly powered on next to you, it would be equivalent to getting a chest x-ray done every 20 years or so
lol what do you even mean I can game for like 4-5 hours on a CRT and have zero issues
@@phil_matic Happy for you. I remember a different story as a kid, sry.
@@pecata What you experienced is exactly what my dad tried to warn me about, but I'm thankful I never had that issue
12:30 I'm sure the people that repaired those things were smart enough to wear thick rubber gloves so they didn't get the shit blasted out of them.
I got an Asus 27" 1440p WOLED and an Acer 390Hz IPS monitor, and with them I am still mid in, say, Warzone and BF2042, but with a crappy 17" Samtron CRT I am always at the top of the charts and even get called a cheater. Explain that :P A cheap CRT, even at 85Hz with zero input lag, is the gaming king...
Playing CSGO on a similar setup, a Philips CRT at 85Hz. Although I'm not a pro at the game, I can comfortably play and the lag is nonexistent :)
In MP games I go with 120Hz on my CRT, and I'm being called a cheater as well.
I used to play with a dual seamless monitor setup at 320Hz each, and for like a month of playing CSGO every single day for hours, I could barely kill one or two people a day. A week ago I bought a Chinese 1990s, completely trashed but working, 12" CRT monitor, and I have been in the top 10 players IN THE WORLD these past 5 days. Thx CRT.
CRTs started to die out when there was an attempt at 29-inch TVs that didn't last as long as the other, smaller ones.
Appreciate the Unreal Tournament gameplay
As some have mentioned, the LG CX and C1 are OLED stand outs so far as they actually employ rolling scan to achieve a CRT like effect yielding an effective 300 Hz or more of motion clarity. This is possible, because with scanning the display as opposed to inserting black frames, they're not limited by the maximum refresh rate. (120 Hz rolling scan is the same on both models. However, on the C1 the effect at 60 Hz was greatly diluted following complaints of flicker on the CX at 60 Hz.) 300+ Hz is still well short of the 1000 Hz needed to approximate CRT. However, I think returns are likely diminishing as even at 300+ Hz equivalent the moving image is already starting to resolve in a really dramatic way versus 60 Hz or 120 Hz without rolling scan, which are both still quite a blur in motion.
So being a CRT enthusiast, these two OLEDs are the ones I went for. They are getting hard to find now, but some are hopefully still out there. (E.g., might get lucky on eBay, Amazon returns...)
As to complaints regarding CRTs, please note that a whole range of these devices existed, from extreme budget to ultra premium. That, combined with folks' very unfortunate habit of leaving them set to the 60Hz Windows default, was bound to leave some with unpleasant experiences. However, I run mine at 100Hz, and they still present gorgeous image quality.
Wait, what? A CX and C1 don't use BFI like other OLEDs? I'm confused. Will their BFI make 30fps look smooth like a plasma? Because the 60Hz BFI on my C2 looks awful at 30fps. Meanwhile on my plasma 30fps looks fantastic in motion.
The CX and C1 are different. If you're actually inserting full black frames, you're limited by the maximum refresh rate. For example, a 240 Hz display is limited to 120 Hz BFI. The CX and C1 actually contain a piece of additional hardware independent of the screen's max refresh to scan the display instead. They are thus not bound by how many full frames it can do a second. This allows those two models to achieve in excess of an effective 300 Hz or more in motion. Sorry, I'm not familiar with how a 30 Hz signal is processed by any of these displays. Though I'd guess interpolation or repetition with 30 dividing evenly into 60 or 120. And then with the plasma you have the advantage regarding how it's driven giving an effective motion resolution of 200+ Hz I read. Whereas the C2 is being limited to 120 Hz, because the special hardware is gone. It's thus limited to inserting actual black frames I think.
The fact they nerfed that is insane to me. Sample-and-hold displays are infamous for their poor motion clarity and then they nerf the one thing that could improve it? Why? Seriously, what you're saying sounds almost like plasma motion clarity. If I understand you correctly, the CX and C1 actually had a rolling scan like a CRT?
@@Stoddardian Yeah. More mimicking CRT with a rolling scan pattern as opposed to inserting actual black frames. Better than plasma motion clarity as I understand it, but still less than CRT clarity. (Around 3.2ms image persistence with this rolling scan on the OLED versus 1ms on a CRT. OLED pixels aren't as bright as CRT phosphors, which constrains, but still a pretty spectacular result.) Why they did it? Apparently cost saving by removing the hardware and associated costs of further development in the face of Samsung re-emerging an OLED competitor challenging LG on the basis of brightness. I think that's what I've basically heard and could make of it.
I would love to see it in action with console games running at a locked 30 FPS. If it works then the decision to remove it is pure insanity from a picture quality perspective. I mean, what on Earth were they thinking? As far as plasma motion clarity is concerned, the best plasmas actually had a persistence of less than 2 ms. Anyway, I hope they bring the rolling scan back eventually.
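The persistence figures in this thread line up with the usual rule of thumb that motion clarity is roughly the inverse of persistence; a small Python sketch:

# Convert image persistence into the sample-and-hold refresh rate that would
# produce roughly the same motion blur (rule of thumb, not a measurement).
def equivalent_hz(persistence_ms: float) -> float:
    return 1000.0 / persistence_ms

print(equivalent_hz(3.2))   # ~312 Hz -- matches the "300+ Hz" rolling-scan figure above
print(equivalent_hz(1.0))   # ~1000 Hz -- roughly CRT territory
print(equivalent_hz(8.33))  # ~120 Hz -- plain 120 Hz sample-and-hold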
Nothing beats a CRT! And I have an OLED C1, a CRT, and a plasma, so I can say it because I am always comparing... especially camera movement, which is very noticeable on OLED/LCD but not at all on a CRT...
It is funny to talk about the motion quality of displays when game developers still fok up their camera movement into an unnatural, jerky, headache-inducing shock experience. I've yet to see the first FPS with camera movement that does justice to anything above 24fps.
Oh God do I just not enjoy 1st person. Immsim genre is fine. But otherwise 3rd person just feels better. Like, no one plays Warframe and thinks "wow this motion sucks it's so dizzying"
The PS3 on my widescreen 480p crt at 30fps plays smoother than many PS5 games on my Oled.
Not even kidding, just try Resistance, amazingly smooth 30fps.
Many PS5 games run at 30 FPS. But yes, CRTs always feel smoother at the same frame rate.
I forgot the name of the tech... it was supposed to replace CRT with a pixel-grid structure like LCD, but each pixel was like a very small cathode-ray-tube of its own... that would've been a good replacement... I think. Maybe, though, motion blur would still be a problem with that as well.
FED / SED
@@ShankMods thank you
@@ShankMods Makes me sad every time I think about SED & FED; even Pioneer's Kuro tech got shut down. It seems like quality displays are only destined for the professional-grade markets.
@@Wobble2007 I have high hopes for Electroluminescent Quantum Dot and MicroLED