@@coolelectronics1759 So what if they're big and bulky lol, I understand if you only have a small place without much room, but I'll take a big CRT over a cheaply made paper-thin LCD any day. Modern CRTs don't have lead in them, but even if they did, so what, it's in the glass. Also, radiation? lol, it's a tiny amount of non-ionizing radiation and is harmless. CRTs are much better for your eyes too; slimmer = cheap and nasty. Although there were some very low-profile slim CRTs from Samsung that were pretty good.
@@Wobble2007 Gotcha, understandable. I was just under the impression the CRT was considered nothing more than obsolete junk at this point, as no one would take the ones I had for sale off my hands, not even for free, even the Sony ones, the supposedly nice ones, and I had to toss all of them unfortunately. It's such a shame, but I guess there is still somewhat of a demographic for them; you just have to know how to move them, no pun intended lol. I sold a typewriter once, last year! Sometimes you really have to sit on these things for them to sell eventually, same as other hard-to-sell items: telephones, printers, grandpa floor TVs, pianos, fax machines, scanners and VCRs. I myself don't have the thinnest LCD either; I paid nothing for it, it's a small (Hansin?) 19" monitor and it's good enough for me. As for CRTs, I guess if an oscilloscope counts, I have two of them, one in front of me and one in my tech lab.
@@coolelectronics1759 Bloody hell mate, you would've made a small fortune on those CRTs now. A simple 17" run-of-the-mill CRT is going for £$100 plus on eBay now, though I wouldn't personally pay that much for a 17". The Sony 24" FW-900 is in the thousands now, up to 7 grand; I'm trying to find one for a grand or less. There are still bargains to be had. Out of interest, what was the best CRT you threw away? Makes me sad to even think about, tbh.
Excellent video. The thumbnail depicts a perfect visualization of "video graphics array," and the presentation was spot on. I agree, staying away from old VGA monitors is wise unless a person highly desires a specific monitor made for or built during a computer's timeline. Even then it would hardly be worth it. Though I do remember when flat panels came out they were the absolute thing to have, and no one seemed to complain about latency with legacy games at the time; I guess we were too dazzled by the insane new monitors to really care.
I have the opposite experience... No one around me would have bought a rubbish "flat screen"... And I used CRTs for a long time, just because of latency and general image quality...
Excellent presentation Shelby. Thank you! I remember seeing my first VGA image on an IBM monitor driven by an IBM Model 30-286 PS/2 (the "30" was the size of the HDD in MBytes, of course) in 1987 at work. I had only experienced EGA before this, so it was like looking at a hi-res (at the time) photograph! It was an IBM stock image of a wharf with fishing boats and mooring lines. Such fine detail blew me away!
The nicest monitors I have ever seen are Mitsubishi Trinitron multi-sync monitors. If you ever see one in the wild, buy it. I have worked with literally hundreds of different CRTs over a 20-year period as an IT professional, everything from early IBM true VGA monitors to 21-inch behemoths from Compaq and Apple. Nothing even comes close to the Mitsubishi. The NEC multi-sync monitors are good too, but not in the same league as the Mitsubishi. Apple monitors are generally good too (though not all; there are some cheap, poorly built Apple monitors). But they do lose some of their good properties when hooked to a PC, either through an adapter for the early ones or directly via the VGA connector on the later ones.
You've worked on hundreds of CRTs but don't know that Trinitron is Sony and Diamondtron is Mitsubishi? If you're referring to the Megaviews, those are dot mask anyway, not aperture grille...
One nice thing today is that there exist DACs and ADCs fast enough that you don't need to generate a mode-specific clock even at crazy high resolutions and refresh rates. With those really high rate display modes, the converter can be wired directly to the female D-sub connector from the board, then you don't need to worry about interference and cable quality.
Also, the old VGA was a cheap hack. The 480 lines come from NTSC, which uses 480i for the visible image. What VGA did was create a 60p version of it, which just means doubling the monitor's horizontal (line) frequency and sending line after line (no interlace). So you could use TV-like circuits to produce VGA monitors (the stuff after the NTSC decoder, up to R/G/B). Some VGA monitors even synced to an R/G/B 480i signal as well, output by an external TV tuner for example.
I actually run 15" CRT in 1024*768. It worked, but won't go up 43Hz Interlaced, or something like so, which was incredibly useless. Also, that ridiculously high-speed CRTs? Good luck with finding video card working with those five BNC cables interface :)
Yes, and even those were just the _palette_ colours. 64 was NOT the number of _simultaneously displayable_ colours, which was limited even further, to a choice of 16 (from the palette). This was due to EGA only working with 4 bits when it came to the actual image to be displayed. The monitor with its 6 colour signal lines did support 64 colours, the standard EGA card did not, a design choice that may have been related to video RAM limitations. @Tech Tangents: Do you have any explanation or excuse for saying 256 colours there? Was it just a silly mistake, or what was the thinking behind that?
@@ropersonline chill. :) I think it’s just a simple mistake. With some tricky programming you could actually get all 64 colors on screen at once. But that wasn’t really usable for games.
@@root42 Tone policing people isn't very chill yourself. Also, compulsively answering the part specifically directed at Tech Tangents exhibits the same unchill qualities and isn't a good look. So, chill? Pot, kettle, black. I would be curious to find out more about those 64 colour hacks though if anyone has any information. Especially if that was possible on original IBM EGA hardware, with or without a memory upgrade. It would depend on the resolution, at least.
@@ropersonline one game that used more than 16 colors is the stat screen of Lemmings. The game switches palettes halfway through the screen to get 32 colors. In the same manner you can more or less switch palettes every scanline. Similar to the Amiga HAM mode. I think I saw a demo where someone made an image viewer that worked like that, but can’t find the link right now.
@@root42 Stat screen or start screen? I'd love to see a screenshot of that tweak in action. The 640x350 EGA start screen screenshot I found at mobygames only has 16 colours. Some VOGONS info I've just googled also seems to suggest Lemmings even switched palettes in 320x200 mode, which is weird, because on original EGA hardware you couldn't use non-CGA colours in 200-line mode, for reasons of CGA monitor backwards compatibility. The 64-colour palette was for the 350-line mode only. Perhaps that "EGA" 320x200 palette switching only occurred on odd EGA clones or VGA hardware. VGA was supported in its own right though, so any VGA owner would have been better off running that.
Cards in teaser: Datapath VisionRGB E1S, Epiphan DVI2PCIe and Avermedia Broadcaster HD. Personally I use the VisionRGB with VCS (third-party software made specifically for Datapath cards).
The last widely available Samsung CRTs could handle extreme resolutions, but were of the shadow-mask type and came bundled with a very thin cable. Another point of mismatch occurs at the connector on the video adapter. A DVI output with an adapter might actually give a clearer signal. But even when it's not working well, the result is just a light ghosting echo, often masked by the pattern of the picture tube.
I remember I had an Nvidia card that supported 100 Hz (I don't remember which resolution it supported at that frequency) and a Philips monitor that supported it. While the image wasn't of the same quality (bleeding, chromatic aberration and more), it was cool seeing the mouse and windows moving way more smoothly than normal.
Looking forward to the VGA capture video! Have you looked into ways of tapping directly into the image signal before it gets sent to the DAC, like what RGB mods for retro consoles do? I'm just curious if it's been done before.
"...tapping directly into the image signal before it gets sent to the DAC..." I suppose it's possible, but that might be limited to some of the earlier VGA cards. From the early to mid 90's, a lot of VGA chipsets began putting the DACs on the same die. This was the 512k or 1M 16bit ISA cards that might do 1024*768 @ 256 colors max. Other options: DSPs on each of the FIVE outputs (R, G, B, H, and V) would get expensive real quick at higher resolutions and frequencies. There's also reading (snooping) the VRAM, processing it digitally, and outputting it another way. Trying to read the RAM while the VGA chipset is trying to read and write to it might be a no-go. Emulating the VRAM with something more modern and faster might do it. An FPGA implementation might work instead of doing a ton of reverse engineering. Signals could be manipulated digitally or some gates could be used as DSPs.
@@anomaly95 I like how the Atari Jaguar has the R2R ladder soldered to one side of the CMOS chip "Tom". I have not seen resistors in CMOS. How do you approach your fab about this? Op-amps are huge, as are charge pumps.
I've known people who can't tell the difference between 60hz and 85hz, but I could, for better or worse. It gave me eyestrain, particularly when we were at the later stage of CRTs with higher resolutions. LCDs don't bother me at 60hz though. I've spent hours in front of monitors every day for decades, both for work and play, and despite my nostalgia for older technology, I wouldn't want to go back.
I've had great success with my Epiphan av.io HD capture box. At US $429 retail, it's not cheap, so might not be a good solution for hobbyists. But you can buy refurbished and used ones for half that (I got mine for $175.) You connect a DE-15 cable directly to the box, then USB to the computer. By default it'll capture all the VESA signals, but with the configuration tool, you can load custom timings and make your own adjustments (porch, vsync, hsync, etc.) to capture any analog signal you throw at it. I haven't found a single game or vintage computer that can't be captured.
I have actually implemented a VGA interface (output side) on an FPGA, and had mostly been testing with an LCD monitor. Timings are annoying, because if they aren't right the monitor tries but fails to identify the resolution, usually ending up with something that looks wonky. However, the LCD monitor does seem to accept some things which may not work out so well on a CRT, such as 1024x768@24Hz (frequency limited due mostly to the clock speed the logic is running at in the FPGA); though I am mostly using 640x480@60Hz timings but producing 320x200 or 640x400 output. Originally, the output looked pretty much horrible, as I was trying to compensate for a limited number of DAC bits (on the board) by PWM'ing the outputs, which the LCD really didn't like (it produced lots of artifacts, which resembled ugly twitching/pulsating rainbows); however, with only a few bits per component, the non-modulated output also looked pretty bad. Eventually I had to tweak it so that the outputs only changed at the pixel-clock frequency (25MHz), and because neither PWM nor temporal dithering worked well with the LCD, I eventually had to settle for a plain Bayer dither. The Bayer dithering works at least, but if one looks closely, there is a pretty obvious thatching pattern in the VGA output. ...
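(For anyone curious what a plain Bayer/ordered dither does in practice, here's a minimal software sketch of the idea: quantizing an 8-bit channel down to a few DAC bits with a 4x4 threshold matrix. This is just an illustration of the technique, not the poster's FPGA logic.)

```python
import numpy as np

# Classic 4x4 Bayer matrix, normalized to thresholds in [0, 1).
BAYER_4X4 = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0

def ordered_dither(channel, out_bits=3):
    """Quantize an 8-bit channel (H x W array) to out_bits using ordered dithering."""
    h, w = channel.shape
    levels = (1 << out_bits) - 1
    # Tile the threshold matrix across the image; this fixed spatial pattern
    # is what produces the visible "thatching" mentioned above.
    thresholds = np.tile(BAYER_4X4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    scaled = channel / 255.0 * levels
    return np.clip(np.floor(scaled + thresholds), 0, levels).astype(np.uint8)

# Example: a horizontal grey ramp quantized to 3 bits per channel.
ramp = np.tile(np.arange(256, dtype=np.uint8), (16, 1))
print(ordered_dither(ramp)[:4, :8])
```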
The VGA protocol is kind of a variant on the SCART and component video protocols, and it is possible to perform conversion among them using some simple analog circuitry.
How did that table make it through the video??? I used to run a 21-inch monitor at a stupid resolution at 85Hz, and when we first plugged it in the picture was terrible, with everything having a shadow or echo on the screen. When I rang the supplier, they said "Oooops, sorry. You need this special cable." They sent me a cable that was standard VGA at the PC end, but a set of BNC connectors at the monitor end. With that, the picture was awesome, though it was used for network management rather than Quake. That was on the other work PC.
Is there a good breakdown of how an HD video signal is displayed? HDMI is only one of a number of different kinds of cables that can carry an HD signal. I found this video to be very informative.
Fascinating video. You have explained things that were relevant to me when I used CRTs and never understood. I'm glad some of it is finally making sense and my past self can relax :-) Rocking the goatee too. Cheers
You used Doom as your example when talking about the limitations of the horizontal sync and the 320x200 res at 70Hz. Is that why Doom had a frame limit of 35, or do you think that was coincidental?
This brings back memories of configuring Linux XFree86 the old way, keeping in mind the warning: bad parameters can fry your monitor! About the Atari ST video output: too bad I'm much more a SW guy than an HW one. I'd like to connect mine to a VGA display. I know it's possible and that it requires only a few components, but I've read it also needs a monitor that can handle "slow" signals, since the ST outputs low-res video compared to VGA.
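(For anyone who never had to do it: those hand-written XFree86 timings were "Modelines", one line per video mode spelling out the pixel clock plus where the active area, front porch, sync pulse and back porch begin and end. A typical one for plain 640x480@60 looked roughly like this; exact section layout varied between XFree86 versions.)

```
Section "Monitor"
    # clock(MHz)  hdisp hsyncstart hsyncend htotal   vdisp vsyncstart vsyncend vtotal
    Modeline "640x480" 25.175  640 656 752 800   480 490 492 525  -hsync -vsync
EndSection
```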
I'm having a problem with a Dell P780 17" CRT reverting its brightness to 100% after a KEYSTROKE. This happens after watching a video, because of keyboard inactivity. It is very frustrating because it is a minor issue, likely with a capacitor or chip, but it obviously causes major visual disruptions. I can't remediate the issue with f.lux because the brightness level it reverts to destroys the black levels, which is why most of us are using CRTs to begin with. Would isolating the unused VGA pins or changing the custom timing in Intel HD Graphics Center possibly help? Or is this a novel issue that only some retired technician somewhere out there, who knows which capacitor needs replacing, could fix?
Ah, that Hyundai ImageQuest V770 monitor brings back memories. I used to have one when I was a kid (I'm only 18 now, lol) but one time when we moved house the built-in DE-15 cable got squished making the screen blurry. I miss it!
I'm quite happy with my 1920x1200 LG VGA-only LCD panel. I tried overclocking it, but the logic board is an ass and will drop frames even at 61Hz, and on top of that it doesn't even sync at 60Hz; it syncs at 59.94Hz as if it were an NTSC TV. It will say 60Hz everywhere, but when you look at the actual mode setting it says 59.94Hz xD
How old is that LG panel? To be struggling to go to higher frequencies it must be at the edge of whatever logic LG had at the time for the video circuit. I know the cinema displays were kneecapped in order to get near that size back in the early 2000s, some requiring dual-link DVI ports to drive at full resolution.
Can you get a VGA only monitor and try and damage it on camera by setting the wrong mode? This would be really interesting to see - particularly with the subsequent electronics diagnostics to see what failed.
Folks, please help. I need to use VGA video on a device that takes pure RGB with csync. Combining H and V sync is quite well documented. What I can't figure out is how to get the device outputting the VGA signal to detect that a display has been connected and start feeding a signal in the first place. I've read that if you ground the ID pins according to the DDC combinations, it'll trigger the device to start outputting a signal. Still waiting to get pins to try it out though.
I believe I have an XGA display, then. Unfortunately, it no longer displays the color red, but I haven't figured out what the problem is, nor can I be bothered to figure it out.
Shelby, you should work on some kind of teleprompter device. Alec from Technology Connections made a video a while back about his setup: ruclips.net/video/YeRu4xYH_W0/видео.html It's not super annoying, and IMO it would greatly improve the quality of your scripted videos. Looking forward to the next one!
It's not complicated at all. It's just an extension of the old analog monochrome video standards introduced with television back in the old pre-NTSC days: amplitude-modulated signals coupled with sync to move the beam around. Old computers meant to work with TVs used exactly the same method, except mixing the sync in with the amplitude video signal (plus colorburst crap overlaid on top for the chroma signal). VGA doesn't muck around with chroma at all, so understanding any VGA signal just involves recognizing the sync so you can align the analog lines properly.
I like that VGA's analogue nature means the max resolution is hypothetically infinite.
Yes, but most 32-bit systems weren't capable of pulling off 4K (the practical ceiling was 2048x1536), and by the time 64-bit systems became the norm, CRTs were already extinct.
24k?!
@@MakotoIchinose
Effectively, a modern VGA system should be capable of 1080p (1920x1080x16777216 colours). However, go back far enough and basic VGA chips would give you very basic results. For example, the standard in the 90s was 800x600x256 and in the 00s, 1024x768x65536.
@@anonUK I think space charge is the limit on CRTs. While with a laser scanner you can focus almost infinite power (until you get second harmonics, like your bright red picking up some violet glow), you cannot focus charged electrons that tightly (for a given beam current). Accept a dark picture and watch in a dark room. Or accept that this blurs the picture even more than the mask does and that you cannot identify single pixels anymore. You might as well go interlaced then, as this doesn't flicker.
CRTs show scanlines in the dark areas of a picture.
The cables, however, are made to a specification, and at a high enough frequency you'll have real trouble properly sending the signal through the cable. High-frequency signals start to get really tricky to play nicely with.
7:50 OH MY GOD! I have a dial on my car dashboard that turns the brightness of the dash lights up and down. Every time I've used the dial over the past 5 years I got some really strong nostalgia and couldn't figure out why. It wasn't until now that I realize it's the same texture and feel of the dials on the bottom of my old family CRT that my dad would always get mad at me for messing with as a kid 😂
The wobbly table is giving me anxiety. Great video though, learned a lot.
I thought I was just too drunk to see... lol
Especially with that HP 16500 (that bitch is heavy). I've got mine sitting in a 19" rack.
@@BlackEpyon I noticed the HP16500A as well. You've got to be a special kind of guy to appreciate a 1650x series LA. I upgraded to a 1670x and 1690x.
My mistake, it's a 16500C.
@@MrWaalkman I think the "C" can take a PS/2 keyboard/mouse. The A and B have a proprietary HP RJ-style keyboard/mouse connector. Mine is a "B" model.
Oh god I could rant about TVs which force overscan on HDMI input for an hour ...
How does overscan work on HDMI? Do they have part of the LCD panel hidden in the bezel, or do they have an LCD panel with a slightly lower resolution that's marketed as being 1080p?
I'm curious because I have a 1080p LCD TV and was kind of annoyed when I found out that 10% of the screen was cropped out.
Yeah, my new LG TV with ONLY HDMI inputs has it on by default on all inputs... MADNESS
@@thepuzzlemaster64 It works in the worst possible way you could imagine. The physical display will actually be 1920x1080 (or whatever resolution is claimed), but the overscan will scale the input pixels so you get scaling effects and reduced de facto resolution. There is no pixel-for-pixel option.
I learned of this awfulness when I hooked up a computer to a cheap Polaroid brand HDTV. When hooked up to HDMI input, the edges were cropped out (annoying but compensatable) and the text was always crummy looking (unacceptable). But there was a VGA input also. When I hooked it up via VGA, there was a menu option for pixel-for-pixel. No cropping. Sharp crisp text, albeit with analog-digital conversion fuzziness.
To add insult to injury, the HDMI input actually DID allow pixel-for-pixel output if the input signal was less than 1080p. If I set the resolution to 1024x768 on HDMI, the display was razor sharp ... but limited to a 1024x768 box with huge black borders all around.
Ugh. Absolutely despicable.
I now have a cheap SCEPTRE 4K TV which I use as a computer monitor. I knew from reviews that it does NOT overscan at 4K, or any resolution other than 1080p. But for 1080p? It ONLY allows overscan. This is slightly better than that damn Polaroid, because the pixel density of 4K practically eliminates the bad scaling effects. But it still would have been a deal killer if it weren't for the fact that I knew I would never use this display with 1080p resolution.
Grrrr....
@@IsaacKuo Oh, so like when you plug a computer into a TV and it comes up at some stupid 1300x1400-or-something resolution.
@@desertfish74 Well, I don't know, maybe it's a region thing? I'm from Europe.
A relatively interesting fact is that even though HDMI is a digital video standard, its video signalling is a direct descendant of DVI, and DVI retains the synchronization pattern from VGA, with HSYNC and VSYNC surviving as two control bits. DVI keeps the front and back porches alive, and HDMI even makes use of them by optionally encoding audio and auxiliary data signals into the blanking intervals. (The funniest part of the spec is the sentence that goes roughly like "each data packet is 32 pixels long.")
I suspect that DVI was designed to be implemented by replacing the DAC with a high-speed serial transmitter and leaving the rest of the video chip intact (knowing the hardware people, I don't think it's too far from the truth) and the spec is explicitly permissive of pixel timings, so we're lucky that modern displays work at all with all that legacy.
Don't forget that DVI does have an analog mode compatible with VGA. (It's not always implemented, but it's there in the specification.)
I feel so inadequate watching this highly excellent technical video on the old VGA standard... Please continue making these videos where I can feel small and learn stuff...
I like the way you put that. Feeling motivated to learn is a great feeling.
But VGA is a fantastic video standard. As he explained in the vid, it is much better than the digital standards used today, mainly because of the lag they produce; VGA has almost no perceptible lag. It's amazing.
Notice that VGA's 31kHz horizontal sync rate is almost exactly double the 15kHz NTSC horizontal sync rate, and the original CGA/EGA 200-line modes were 15kHz NTSC-compatible modes.
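(You can check the doubling from the published numbers; a quick sketch using the standard 640x480@60 VGA timing and the NTSC line rate:)

```python
# Standard 640x480@60 VGA timing: 25.175 MHz pixel clock,
# 800 total pixel periods per line, 525 total lines per frame.
pixel_clock_hz = 25_175_000
pixels_per_line = 800            # 640 active + blanking/sync
lines_per_frame = 525            # 480 active + blanking/sync

h_rate = pixel_clock_hz / pixels_per_line    # ~31.47 kHz line rate
v_rate = h_rate / lines_per_frame            # ~59.94 Hz refresh

ntsc_h_rate = 15_734.26                      # NTSC line rate in Hz

print(f"VGA line rate: {h_rate / 1000:.3f} kHz")
print(f"VGA refresh:   {v_rate:.2f} Hz")
print(f"Ratio to NTSC: {h_rate / ntsc_h_rate:.3f}")   # ~2.0
```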
The Technology Connections YouTube channel has said many times that there's no defined resolution for an analog signal; it's just artificially divided into lines.
The VGA adapter really has just two line counts: 400 and 480 lines. Many cards don't even display the 350-line mode, but use the 400-line mode and draw 50 lines black.
Even though the VGA adapter is a complicated hardware mess with tons of registers that can set a lot of different modes, almost all of them use 400 or 480 lines. VGA has its unique 'double scanlines' in the low-res 200- and 240-line modes, because the hardware displays each line twice.
Base VGA cards are also able to display 600-line modes at 56 Hz. The Windows 3.x SVGA 800x600 16-color driver will work on any VGA card, but a monitor capable of 600-line modes might be required.
VGA cards can display various modes at 200, 240, 300, 400, 480 and 600 lines. Some games also support 224- and 256-line modes, but my (S)VGA cards synchronize the display to higher line counts. Horizontal resolution can be really anything; it's just an analog signal. It can be as low as 240 and as high as 800. I've encountered software that used 240, 256, 296, 320, 360 and 400 horizontal resolution. DOS Quake shows this really well, offering horizontal resolutions of 320 and 360 pixels at 200, 240, 350, 400 and 480 vertical lines. These are all basic VGA modes. The highest 256-color VGA mode is 400x600.
Ironically, VESA modes aren't that flexible, but VESA is just simpler and better. VGA is limited to the 20-bit memory addressing of x86 real mode, which maps only a 64 KB window of video memory just above the base 640 KB. VESA-compatible cards have a newer 32-bit memory address mode, where a linear frame buffer is located near the end of the 4 GB address space.
However, when VESA was finally getting popular, it was killed by Windows 95 and graphics drivers. VESA never had a chance to evolve into a dedicated low-level graphics API with increasingly more advanced features, maybe with OpenGL or hardware 3D support in general being added. All GPUs are still VGA/VESA compatible, but just as a way for OSes to display a GUI on every GPU without specific drivers.
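(A tiny sketch of why that 64 KB window was such a pain compared to a linear framebuffer: in a banked SVGA mode, every pixel write first needs the right 64 KB bank mapped into the A000 segment. This is illustrative address math only; the actual bank-switch call went through the VESA BIOS.)

```python
# Address math for a hypothetical 800x600, 8 bits-per-pixel banked mode.
def banked_address(x, y, bytes_per_line=800, window=64 * 1024):
    offset = y * bytes_per_line + x              # linear offset into video memory
    return offset // window, offset % window     # (bank to map, offset within A000 window)

# Pixel (10, 100) lands past the first 64 KB, so bank 1 must be mapped first.
print(banked_address(10, 100))    # -> (1, 14474)
```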
One day I connected a 32 inch LCD TV to an old PC (Pentium I or III, don't remember which) with an S3 video card through a VGA cable, started some DOS-based hardware identification software (hwinfo maybe) and it showed information about the TV (maker, type, size, a few things). I hope you will talk about this identification protocol, I did not know about it.
EDID. Basically there's an I2C line in the connector, and you expect that a TV has an I2C EEPROM connected there directly, or a device that can behave like one. It simply being an I2C chip is advantageous because it can be powered over the cable; otherwise the monitor or TV would need to be turned on before it could be identified.
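(If you want to poke at that data yourself, on Linux the kernel exposes the EDID it read over that DDC/I2C line under /sys/class/drm. Here's a minimal sketch that decodes a few header fields; the connector name in the path is just an assumption, yours will differ.)

```python
from pathlib import Path

# Hypothetical connector name; check /sys/class/drm/ for the real one.
edid = Path("/sys/class/drm/card0-VGA-1/edid").read_bytes()

# Every EDID block starts with the same fixed 8-byte header.
assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID block"

# Bytes 8-9: manufacturer ID, three letters packed as 5-bit values.
raw = int.from_bytes(edid[8:10], "big")
maker = "".join(chr(((raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

product = int.from_bytes(edid[10:12], "little")   # bytes 10-11: product code
week, year = edid[16], 1990 + edid[17]            # bytes 16-17: manufacture date

print(f"{maker} product 0x{product:04X}, week {week} of {year}")
```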
What brand was the set?
That's interesting.
I just bought a preowned Sony Bravia 40" TV yesterday and it too has a VGA socket on the back.
I wonder if my PC would be able to pull some info off it if I did the same.
It's 1080p.
@@coolelectronics1759 Any remotely modern screen/adapter with VGA input/output (or DVI, DP or HDMI for that matter) will support EDID in one form or another. The 1.0 standard goes all the way back to 1994.
9:55 "and why HDMI Overscan is stupid"
I have the same emotions...
Little secret of HDMI: The 'image' that HDMI sends raw is actually a fair bit larger than the picture - and those extra pixels carry the audio information.
It's designed that way because it simplifies the electronics at intermediate stages. The device generating the signal and the device receiving it need to deal with packing and unpacking the audio, but everything in between just needs to pass three differential signals, clocked. No framing to worry about. It can ignore the 8B10B encoding. As far as everything between the very start and very end of the signal chain is concerned, it's just three streams of bits.
A noteworthy omission is that none of those extra pixels are allocated for closed caption data.
Horizontal and vertical blanking and synchronization information is carried beyond the active portion of the video signal. For a progressive 60 Hz 1920x1080 signal, for example, the total signal containing horizontal and vertical sync data extends the overall "image" to 2200x1125: 1080 active lines of vertical resolution and 1920 active pixels per line of horizontal resolution, plus blanking. All sorts of data can be stored in those non-active regions containing sync pulses, but some must remain so the display knows the end and beginning of each succeeding frame or field. No display is capable of running without some blanking or sync data: CRT, LCD, plasma, OLED, or otherwise.
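(A rough sketch of what those totals mean in numbers, and why there's comfortable room for audio packets in the blanking, keeping in mind that only a fraction of the blanking slots are actually usable as data islands:)

```python
active_w, active_h = 1920, 1080      # visible picture
total_w, total_h = 2200, 1125        # including blanking, per the totals above
fps = 60

total_pixels = total_w * total_h                      # 2,475,000 per frame
blanking_pixels = total_pixels - active_w * active_h  # 401,400 per frame

pixel_clock = total_pixels * fps                      # 148,500,000 -> 148.5 MHz
print(f"Pixel clock:            {pixel_clock / 1e6:.1f} MHz")
print(f"Blanking slots per sec: {blanking_pixels * fps:,}")
```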
@@NUCLEARARMAMENT That's not what HDMI overscan is. It's the source that adds black pixels *inside* the 1920×1080 window, because the TV wants to enlarge that digital picture.
This is a great presentation, Shelby! The quality content just keeps getting better and better on this channel :)
I'm perplexed about the last bit. Sure the high frequency signals needed for high-res/high-refresh modes certainly needed good quality cables, but what do you mean you don't know if it's worth it? Not only was it worth it, but it was absolutely necessary. Good quality CRT monitors with low-persistence phosphors at 60hz (and at high resolutions) are headache inducing after a few minutes, nobody I know would ever run a CRT at anything lower than 75hz, and for me 85 and up was extremely desirable. And given good quality cables and connectors, the picture quality is excellent, no degradation at all. I have no idea how they managed that, it seems like black magic to me, but they absolutely did.
The concise script and clear explanation are REALLY appreciated. Thorough yet still entertaining content.
I've been looking at various VGA and composite signal drivers from old retro systems like arcade boards, Commodores and game consoles, and many docs would mention the "blanking" period and show it as just this time period where nothing happens. But you are the first to really describe what that timing actually is and why it exists. A very eureka moment for me. Thank you for your content!
The CRT hue changing throughout the video really tricked me.
"I could've sworn the colours were different"
great video!
Multimode CRTs existed before the VGA video standard. NEC was already marketing their original MultiSync and MultiSync II when the PS/2 showed up. They later included an HD-15 to DE-9 adapter to use the monitor with the fancy new video standard. VGA could also output 15kHz video at NTSC or CGA timings with some register poking.
Regarding high-refresh CRTs, the bigger issue is usually the quality of the video card's onboard DAC. Most output muddy, blurry video at higher refresh rates. Hook up a card with a quality DAC, like a Matrox, and you clearly see the difference. What's odd is that video card vendors never really marketed this, just the resolution, color, and refresh rates!
I don't get how the amplifier in the CRT is not always the limiting factor. It's more difficult to amplify up to 60 V. How did they even shield this to avoid radio emission?
I was born watching 4:3 CRTs,
Molded by them,
I didn't own a Widescreen LCD TV until I was a man!
That's crazy, I just bought a preowned Sony Bravia 1080p 40" LCD yesterday, with remote and stand, for only $80!
If I remove the degauss function will you die?
A really great explanation of a very confusing topic, although I kept getting distracted by the wrong-colored and color-changing color bars on the monitors.
Looking forward to the next installment!
This is my favourite channel on youtube. Keep them coming.
So VGA is Video Graphics Array.
Then SVGA is Super Video Graphics Array.
Then XGA is *Xtreme* Graphics Array?
jk
Jokes aside, iirc the X was for eXtended, but I'm sure someone bought some 24k gold monster XGA cables thinking the X was for eXtreme Graphics and that it would make his low-res textures look better and double the game's poly count.
Then you have SXGA, Super eXtended Graphics Array (iirc 1280x1024), and so on... VGA has kept going since 1987 lol
1024x768 resolution is also known as 8514/A, named after the model number of the first IBM video card to support it (introduced with the PS/2 line in 1987). It was replaced by the XGA (eXtended Graphics Array) card in 1990 and then by the XGA-2 card in 1992.
Pretty much everything people know as VGA is down to it using the same cable: each new standard offered more than the last but still had the same name and the same connector. It would probably not have caught on if it weren't for that, and would've died as quickly as EGA did.
It's even managed to outlive a few of its successors (DVI, ADC, Mini DP/Dell's USB video port, etc.), and is still used as standard for cash registers, projectors and server displays. Next to the x86 architecture, it's probably one of the oldest PC standards still in use today.
It got sillier - by the time VGA was starting to be displaced, it was up to WSXGA+.
Widescreen Super Extended Graphics Array... Plus.
Love the beard. It makes you look like a Musketeer.
It's a goatee, or at least it's aspiring to be, with that gap between the mustache and beard.
I agree, works better with the long hair
Yup
My thought is that this is the Shelby from Star Trek's Mirror, Mirror universe
He looks like druaga1 with that goatee
I made a device for a visually-impaired friend, a matrix switchbox. A charge pump to turn the +5V available off the VGA port into +-5V, which drove a high-frequency three-channel fixed-gain amplifier (AD8075), then a set of three 1P4T switches with 75Ω resistors. Pass all the sensing pins and sync straight through. A handy little box with three switches that let you manipulate the three color channels in VGA - swap them around, remove one, duplicate one onto two or three.
Have trouble on games because you are red-green color blind? Swap the green and blue channels, and those red/green meters turn into red/blue meters you can see more easily.
Doing the same on HDMI would require far, far greater electronics skill - you'd need to program an FPGA. But in analog, not only could I do all of that with simple analog processing, I did it almost entirely on stripboard too - the only SMD part I needed was a tiny carrier board for the amplifier. I did have to keep all the wires as short as I could, though, at that sort of frequency. I was really surprised that there was no visible loss of quality, even at the highest resolution my monitor could handle.
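(Purely as an illustration of what the box does, here's the same channel remapping done in software on a single captured frame. Sketch only, using numpy + Pillow; the filenames are placeholders.)

```python
import numpy as np
from PIL import Image

frame = np.array(Image.open("frame.png").convert("RGB"))

# Swap the green and blue channels, as in the red/green meter example above.
swapped = frame[:, :, [0, 2, 1]]    # new channel order: R, B, G

Image.fromarray(swapped).save("frame_swapped.png")
```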
That's awesome of you to help out your friend like that!
I wish they'd employ people like you when designing cars. My friend's digital dashboard has a red line on a green overlay for his speedometer, but he's red-green colourblind, so he literally can't see how fast he's going!
I've installed LineageOS on my Samsung Note 3 and it has display settings allowing me to adjust colours and help compensate for colour-blindness. Just thought I'd let you know this is available if your friend struggles with smartphones.
Also, lots of new cars use extremely thin strips of LEDs circling the headlights and brake lights for their turn signals, and for anybody with astigmatism, thick glasses, or poor colour recognition, the dull yellow LEDs get completely swamped by the much brighter lights next to them. It's like somebody was asked to design the absolute worst way you could implement a vital safety feature, and that's exactly what they use now. They never once considered people who don't have perfect vision, who make up 20-40% of everybody if you just include colour-blindness and wearing glasses.
@@Microwave_Dave Oh, here: ruclips.net/video/8HBHJB0lAL0/видео.html
I've already described the circuit in sufficient detail that anyone with a reasonable skill level could build it. But if you have use for such a thing, I could draw up a schematic such that anyone who knows which end of a soldering iron to hold could build one. It's remarkably simple to construct, though VGA is becoming a rarer connection to encounter every year now.
Never thought I would say this, but that is a fantastic thumbnail! The video is great too, your channel is clearly evolving in a very good way, keep up the good work!
Wow, brilliant video Shelby. I never really put much thought into the intricacies of VGA standards but this video sure changed that. Thanks for the detailed explanation
I remember when the NEC multi-sync monitors came out. Those things were game changers.
I also remember when I bought my first PC clone and had to read the monitor instructions on how to set the video card sync polarities to get it to work properly. Good times. I still have that machine.
I never really appreciated the wiccan voodoo in how problem-free (for a consumer of VGA) the whole process is.
Learning more from your fantastic production has left me in deep introspection while watching a progress bar crawl along with stunning clarity on a VGA NEC AccuSync LCD71v.
I feel the universe has a higher error rate drawing the room around the NEC monitor. A miss scanned draw line now and then not only seems acceptable after watching... But almost unacceptable an error seemingly never happens. Your video has made me appreciate this NEC display.
I have used some janky displays and own a bunch too. I just tried using an off-brand TV as an HDMI display, and the overscan was worse than an 80's Sears TV console. I did not know what was going on till you mentioned the phenomenon. Anything that actually needed overscan is [likely] 4:3, so... I feel your plight.
The scaling on this cheap overscanning HDMI TV/monitor is so horrendous that text is unreadable. The same TV is stunning with a VGA cable.
This production has made me appreciate this NEC: there is no appreciable delay in the mode changes, and the image is perfectly centered and in phase, faster than most tube monitors I encountered. Many of my other displays need to see the image for a moment, and then make a correction of up to inches. Fun fact: the HDMI display mentioned above replaced the NEC on my 32-bit machine for a while (both connected via VGA). When booting up, during the change from the BIOS to the Linux splash screen, the NEC is seamless, while the HDMI TV puts the Mint logo half off the right side of the screen and only attempts to center it moments later. Because the background and border of the splash screen are black, the display has no reference and never quite hits a proper center. It's cute. It will not center your desktop if the background has black edges either.
How a non-electron-gun display handles something like a coax or composite input can vary wildly. I feel there was a purposeful implementation of poor-quality handling of these inputs by certain manufacturers to push new media sales.
Example: two 32" LCD TVs (brand A, brand C) playing the same VHS, from the same VCR, over composite output. One looks great, one is awful. I would have thought there'd be no appreciable difference.
Set-top DVD recorders had the same wide swing in quality among brands. My Sony on a 3-hour burn is okay quality at best; the same quality is present on my no-name burners on a 6-hour setting. But the Sony will push the VHS out of its HDMI output seemingly upscaled. (If you burn a tape on the Sony, the encoder rears its head, so you play the tape on the Sony while burning that output with the no-name burner, lol. [My CyberHome DVD burner ignores Macrovision signals and pirates the DivX encoder that bankrupted the company.])
The best means of analog capture I have found was one of those little boxes you put in line with the VGA, then S-Video into the set-top burner I mentioned, and ripping the DVD. The in-line box will allow you to set size, PAL/NTSC, colour, zoom, brightness, etc. I use mine with a Hitachi 72" projection TV, and I can read the standard system font at 1024x768. It's not laser clear, but the TV has something like 800 scan lines; I would not have thought there'd be enough to scan back up, but it looks pretty good. Back in the day it would have been stunning.
I look forward to the rest of the series.
Happy Day!
High refresh is one of the only reasons I'd keep a CRT PC monitor in my room lol
What about the beautiful picture, the inky colours and blacks, zero motion blur, zero lag, and no fixed native resolution, so it looks good even at low resolutions? Even OLED has worse motion response and a fixed native resolution, not to mention there are no OLED monitors available yet. I will admit that when 24/27/32-inch OLEDs with 120 Hz refresh rates and improved motion response are available at reasonable prices - say starting at £/$300 for the 24", £/$500 for the 27" and £/$800 or so for the 32" - then OLED will be a good alternative to CRT monitors. Until then, there are CRTs that can do 160 Hz refresh rates, and 2048x1536 @ 90 Hz on a CRT is a beautiful thing, I can tell you. Nothing comes close to that three-dimensional, bright, colourful image a CRT produces; it has so much depth, which makes gaming so much more fun and immersive. My 32" 144 Hz VA G-Sync gaming monitor just looks flat and dull compared to a CRT. They have tried for 15 years to get close to CRT with LED, quantum dots, HDR, 340 Hz+ refresh rates, dual-panel LCDs (the best version of LCD), multi-layer LCDs, and none of it has come even remotely close to CRT, and it never will. Even OLED has its limits. Maybe LCoS or laser phosphor will finally knock CRT off its pedestal, seeing as its true successor, FED/SED technology, was shut down - we would have 85" CRT-quality flat screens by now, and they would have gotten cheap by this point too. Ah well, maybe in 10 years we will finally have something worth buying; maybe CRTs will even be manufactured again.
but they are fat and sucky and with lead and radiation
if only they were slimmer
@@coolelectronics1759 So what if their big and bulky lol, I understand if you only have a small place with not much room, but I will take a big CRT over a cheaply made paper thin LCD any day. Modern CRTs don't have lead in them but even if the di so what, it's in the glass, also radiation? lol it's a tiny amount of no ionizing radiation and is harmless, CRTs are much better for your eyes too, slimmer = cheap and nasty. Although there were some very low profile slim CRT's from Samsung that were pretty good.
@@Wobble2007 gotcha understandable.
I was just under the impression that CRTs were considered nothing more than obsolete junk at this point, as no one would take the ones I had for sale off my hands, not even for free - even the Sony ones, the supposedly nice ones - and I had to toss all of them, unfortunately. It's such a shame, but I guess there is still somewhat of a demographic for them; you just have to know how to move them, no pun intended, lol. I sold a typewriter once, last year! Sometimes you really have to sit on these things for them to sell eventually. Same goes for other hard-to-sell items: telephones, printers, grandpa floor TVs, pianos, fax machines, scanners and VCRs.
I don't have the thinnest LCD myself either. I paid nothing for it; it's a small (Hansin?) 19" monitor and it's good enough for me.
As for CRTs, I guess if an oscilloscope counts, I have two of them - one in front of me and one in my tech lab.
@@coolelectronics1759 Bloody hell mate, you would've made a small fortune on those CRTs now - a simple run-of-the-mill 17" CRT is going for £/$100 plus on eBay, though I wouldn't personally pay that much for a 17". The Sony 24" FW-900 is in the thousands now, up to 7 grand; I'm trying to find one for a grand or less. There are still bargains to be had. Out of interest, what was the best CRT you threw away? It makes me sad to even think about, tbh.
Excellent video. The thumbnail depicts a perfect visualization of "video graphics array," and the presentation was spot on. I agree, staying away from old VGA monitors is wise unless a person highly desires a specific monitor made for, or built during, a computer's timeline - and even then it would hardly be worth it. Though I do remember when flat panels came out they were the absolute thing to have, and no one seemed to complain about latency with legacy games at the time; I guess we were too dazzled by the insane new monitors to really care.
I have the opposite experience... No one around me would have bought a rubbish "flat screen"... and I used a CRT for a LONG time, precisely because of latency and general image quality...
That's a very wobbly table.
Fantastic video, and yes, VGA is much more complicated than I thought, but that wobbly table was driving me insane.
Excellent presentation Shelby. Thank you! I remember seeing my first VGA image on an IBM monitor driven by an IBM Model 30-286 PS/2 ( the "30" was the size of the HDD in MBytes of course ) in 1987 at work. I had only experienced EGA before this, so it was like looking at a hi-res ( at the time ) photograph! It was an IBM stock image of a wharf with fishing boats and mooring lines. Such fine detail blew me away!
3-5 Video series? Sign me the fuck up!
I'm looking forward to this series. I'm kinda hoping you also touch on arcade video standards as well.
That mic looks and sounds good!
Loving the goatee. And I've been excited for this for a while!
Clicking like before I even watched! You put so much work into this.
The nicest monitors I have ever seen are Mitsubishi Trinitron multi-sync monitors. If you ever see one in the wild, buy it. I have worked with literally hundreds of different CRTs over a 20-year period as an IT professional - everything from early IBM true VGA monitors to 21-inch behemoths from Compaq and Apple. Nothing even comes close to the Mitsubishi. The NEC multi-sync monitors are good too, but not in the same league as the Mitsubishi.
Apple monitors are generally good too (though not all - there are some cheap, poorly built Apple monitors). But they do lose some of their good properties when hooked to a PC, through an adapter for the early ones or directly via the VGA connector for the later ones.
You've worked on hundreds of CRTs but don't know that Trinitron is Sony and Diamondtron is Mitsubishi? If you're referring to the Megaviews, those are dot mask anyway, not aperture grille...
One nice thing today is that there exist DACs and ADCs fast enough that you don't need to generate a mode-specific clock even at crazy high resolutions and refresh rates.
With those really high rate display modes, the converter can be wired directly to the female D-sub connector from the board, then you don't need to worry about interference and cable quality.
It's hard to believe so much knowledge and information is inside your head.
Excellent vid.
Bender Rodriguez is a nice added bonus.
Also, the original VGA was a cheap hack. The 480 lines come from NTSC - 480i for the visible image. What VGA did was create a 60p version of it, which meant roughly doubling a monitor's horizontal scan frequency and sending line after line with no interlace. So you could use TV-like circuits to produce VGA monitors (essentially the stuff after the NTSC decoder, which already works in R/G/B).
Some VGA monitors would even sync to an R/G/B 480i signal, output by an external TV tuner for example.
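(A quick sanity check of that line-doubling claim, as a sketch: both NTSC and 640x480 VGA use 525 total scanlines, but VGA scans them progressively at twice the field rate, so the horizontal frequency roughly doubles while the vertical rate stays near 60 Hz.)

```python
# Rough numbers only: NTSC vs. standard 640x480 VGA timing.
# Both use 525 total scanlines; VGA just runs them progressively, twice as fast.
LINES_TOTAL = 525
ntsc_frame_rate = 30_000 / 1001     # ~29.97 Hz interlaced frame rate
vga_frame_rate = 60_000 / 1001      # ~59.94 Hz progressive, the classic VGA rate

ntsc_hfreq = LINES_TOTAL * ntsc_frame_rate   # ~15.73 kHz line rate
vga_hfreq = LINES_TOTAL * vga_frame_rate     # ~31.47 kHz, i.e. double NTSC's line rate

print(f"NTSC line rate:        {ntsc_hfreq / 1000:.3f} kHz")
print(f"VGA 640x480 line rate: {vga_hfreq / 1000:.3f} kHz")
```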
It is unfair how such an interesting and detailed video gets so few views. Kudos to the author!
This explains why the picture is often shifted... thanks.
Good luck reaching 100,000 subscribers!
I actually ran a 15" CRT at 1024x768. It worked, but it wouldn't go above 43 Hz interlaced, or something like that, which made it incredibly useless. Also, those ridiculously high-spec CRTs? Good luck finding a video card that works with that five-BNC-cable interface :)
@2:49 nitpicking: EGA had a maximum of 64 colors, due to the 6-bit palette, not 256 colors.
Yes, and even those were just the _palette_ colours. 64 was NOT the number of _simultaneously displayable_ colours, which was limited even further, to a choice of 16 (from the palette). This was due to EGA only working with 4 bits when it came to the actual image to be displayed. The monitor with its 6 colour signal lines did support 64 colours, the standard EGA card did not, a design choice that may have been related to video RAM limitations.
@Tech Tangents: Do you have any explanation or excuse for saying 256 colours there? Was it just a silly mistake, or what was the thinking behind that?
@@ropersonline chill. :) I think it’s just a simple mistake. With some tricky programming you could actually get all 64 colors on screen at once. But that wasn’t really usable for games.
@@root42 Tone policing people isn't very chill either. Also, compulsively answering the part specifically directed at Tech Tangents exhibits the same unchill qualities and isn't a good look. So, chill? Pot, kettle, black. I would be curious to find out more about those 64-colour hacks, though, if anyone has any information - especially whether that was possible on original IBM EGA hardware, with or without a memory upgrade. It would depend on the resolution, at least.
@@ropersonline one game that used more than 16 colors is the stat screen of Lemmings. The game switches palettes halfway through the screen to get 32 colors. In the same manner you can more or less switch palettes every scanline. Similar to the Amiga HAM mode. I think I saw a demo where someone made an image viewer that worked like that, but can’t find the link right now.
@@root42 Stat screen or start screen? I'd love to see a screenshot of that tweak in action. The 640x350 EGA start screen screenshot I found at mobygames only has 16 colours. Some VOGONS info I've just googled also seems to suggest Lemmings even switched palettes in 320x200 mode, which is weird, because on original EGA hardware you couldn't use non-CGA colours in 200-line mode, for reasons of CGA monitor backwards compatibility. The 64-colour palette was for the 350-line mode only. Perhaps that "EGA" 320x200 palette switching only occurred on odd EGA clones or VGA hardware. VGA was supported in its own right though, so any VGA owner would have been better off running that.
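(To make the palette-switching trick concrete, here's a toy model - not real EGA register code, and the palette choices are purely illustrative. The hardware only ever shows 16 palette entries at once, but if you reload the palette partway down the frame, the frame as a whole can contain more than 16 of the 64 gamut colours.)

```python
# Toy simulation of mid-frame EGA palette switching (350-line hi-res mode).
# The EGA gamut is 64 colours: 2 bits each of R, G and B.
EGA_GAMUT = [(r * 85, g * 85, b * 85)
             for r in range(4) for g in range(4) for b in range(4)]

def distinct_colours(palettes):
    """palettes: one 16-entry palette (indices into EGA_GAMUT) per screen region."""
    seen = set()
    for palette in palettes:
        assert len(palette) <= 16        # only 16 entries are visible at any instant
        seen.update(EGA_GAMUT[i] for i in palette)
    return len(seen)

# Two different palettes, swapped halfway down the screen (the Lemmings-style trick):
print(distinct_colours([list(range(16)), list(range(16, 32))]))  # -> 32
```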
Cards in teaser: Datapath VisionRGB E1S, Epiphan DVI2PCIe and Avermedia Broadcaster HD. Personally I use VisionRGB with VCS (a third-party software made specifically for Datapath cards).
The last widely available Samsung CRTs could handle extreme resolutions, but they were of the shadow-mask type and came bundled with a very thin cable. Another point of mismatch occurs at the connector of the video adapter; a DVI output with an adapter might actually give a clearer signal. And even when things aren't working well, the result is just a light ghosting echo, often masked by the mask pattern of the picture tube.
I remember I had an Nvidia card that supported 100 Hz (I don't remember at what resolution) and a Philips monitor that supported it. While the image wasn't the same quality (bleeding, chromatic aberration and more), it was cool seeing the mouse and windows move way more smoothly than normal.
I literally just watched a video about VGA for no reason, and for some reason I don't regret it.
You Answered so many questions I've had over the years.
I wonder if someone has ever built a box to fix all these problems
Looking forward to the VGA capture video!
Have you looked in to ways of tapping directly into the image signal before it gets sent to the DAC like what RGB mods for retro consoles do? I'm just curious if it's been done before.
"...tapping directly into the image signal before it gets sent to the DAC..."
I suppose it's possible, but that might be limited to some of the earlier VGA cards. From the early to mid 90s, a lot of VGA chipsets began putting the DACs on the same die. Those were the 512 KB or 1 MB 16-bit ISA cards that might do 1024x768 @ 256 colors max.
Other options:
DSPs on each of the FIVE outputs (R, G, B, H, and V) would get expensive real quick at higher resolutions and frequencies.
There's also reading (snooping) the VRAM, processing it digitally, and outputting it another way. Trying to read the RAM while the VGA chipset is trying to read and write to it might be a no-go. Emulating the VRAM with something more modern and faster might do it.
An FPGA implementation might work instead of doing a ton of reverse engineering. Signals could be manipulated digitally or some gates could be used as DSPs.
@@anomaly95 I like how the Atari Jaguar has the R2R ladder soldered to one side of the CMOS chip "Tom". I have not seen resistors in CMOS - how do you approach your fab about this? Op-amps are huge, as are charge pumps.
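(For anyone unfamiliar with that R2R ladder: it's about the simplest possible video DAC. Here's a sketch of the idealised maths with illustrative values; a real VGA output stage is designed so full scale is roughly 0.7 V into the monitor's 75 Ω termination.)

```python
# Ideal (unloaded) R-2R ladder DAC: an n-bit code produces Vout = Vref * code / 2^n.
# Values are illustrative; a real card's output network scales the result so the
# full-scale level is about 0.7 V across the 75-ohm termination in the monitor.
def r2r_output(code: int, bits: int = 3, vref: float = 0.7) -> float:
    assert 0 <= code < 2 ** bits
    return vref * code / (2 ** bits)

# A hypothetical 3-bit-per-channel output gives 8 levels per colour:
print([round(r2r_output(c), 3) for c in range(8)])  # 0.0 up to ~0.613 V in ~88 mV steps
```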
I've known people who can't tell the difference between 60hz and 85hz, but I could, for better or worse. It gave me eyestrain, particularly when we were at the later stage of CRTs with higher resolutions. LCDs don't bother me at 60hz though. I've spent hours in front of monitors every day for decades, both for work and play, and despite my nostalgia for older technology, I wouldn't want to go back.
I enjoyed this presentation very much, I watched it twice.
Whoa! It's Shelby's evil twin!
🤣
I've had great success with my Epiphan av.io HD capture box. At US $429 retail, it's not cheap, so might not be a good solution for hobbyists. But you can buy refurbished and used ones for half that (I got mine for $175.) You connect a DE-15 cable directly to the box, then USB to the computer. By default it'll capture all the VESA signals, but with the configuration tool, you can load custom timings and make your own adjustments (porch, vsync, hsync, etc.) to capture any analog signal you throw at it. I haven't found a single game or vintage computer that can't be captured.
I have actually implemented a VGA interface (output side) on an FPGA, and had mostly been testing with an LCD monitor. Timings are annoying, because if they aren't right, the monitor tries but fails to identify the resolution, usually ending up with something that looks wonky. However, the LCD monitor does seem to accept some things which might not work out so well on a CRT, such as 1024x768 @ 24 Hz (frequency limited mostly by the clock speed the logic runs at in the FPGA); though I am mostly using 640x480 @ 60 Hz timings while producing 320x200 or 640x400 output.
Originally, the output looked pretty horrible, as I was trying to compensate for the limited number of DAC bits on the board by PWM'ing the outputs, which the LCD really didn't like (it produced lots of artifacts resembling ugly twitching/pulsating rainbows). With only a few bits per component, though, the non-modulated output also looked pretty bad. Eventually I had to tweak it so that the outputs only changed at the pixel-clock frequency (25 MHz), and because neither PWM nor temporal dithering worked well with the LCD, I eventually settled on a plain Bayer dither.
The Bayer dithering works at least, but if one looks closely, there is a pretty obvious thatching pattern in the VGA output.
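(If anyone wants to see what that looks like in code, here's a minimal ordered-dither sketch using the standard 4x4 Bayer matrix, reducing 8-bit samples to a few output levels. It's a software illustration of the technique, not the FPGA implementation described above.)

```python
# Minimal 4x4 Bayer ordered dithering: quantise an 8-bit value to `levels`
# output levels, using the pixel's screen position to pick a threshold.
# The position-dependent rounding is what produces the fine "thatching" pattern.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither(value: int, x: int, y: int, levels: int = 4) -> int:
    """Return the quantised level (0..levels-1) for an 8-bit input value."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0  # position-dependent threshold
    scaled = value / 255 * (levels - 1)                  # ideal (fractional) level
    base = int(scaled)
    frac = scaled - base
    return min(base + (1 if frac > threshold else 0), levels - 1)

# A mid-grey field quantised to 4 levels comes out as a fine checker pattern:
print([dither(128, x, 0) for x in range(8)])
print([dither(128, x, 1) for x in range(8)])
```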
...
Woah, it's Evil Tech Tangents! I shall call you Flexo.
Heck YEAH! VGA still rules, and now i know how it works :D
this is fantastically informative and I'm looking forward to more
A N I M E
N
I
M
E
The VGA signal is basically RGB video with separate syncs, which makes it a close cousin of SCART RGB and (after a colour-space matrix) of YPbPr component video, so conversion among them can be done with fairly simple analog circuitry.
How did that table make it through the video ???
I used to run a 21 inch monitor at stupid resolution at 85Hz and when we first plugged it in the picture was terrible, with everything having a shadow or echo on the screen. When I rang the supplier, they said "Oooops, sorry. You need this special cable." They sent me a cable that was standard VGA at the PC end, but a set of BNC connectors at the monitor end. With that, the picture was awesome, though it was used for network management rather than Quake. That was on the other work PC.
A good CRT needs a good cable, particularly when you go above 1024x768 at 75 Hz.
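(A rough feel for why, as a sketch with assumed blanking overheads: the pixel clock, and therefore the analog bandwidth the cable must carry cleanly, climbs fast with resolution and refresh rate. For reference, the real VESA timing for 1024x768 @ 75 Hz uses a 78.75 MHz pixel clock, in the same ballpark as this estimate.)

```python
# Back-of-the-envelope pixel-clock estimate, assuming roughly 25% horizontal
# and 5% vertical blanking overhead (real VESA timings vary a bit per mode).
def approx_pixel_clock_hz(width, height, refresh_hz, h_blank=0.25, v_blank=0.05):
    return width * (1 + h_blank) * height * (1 + v_blank) * refresh_hz

for mode in [(1024, 768, 75), (1280, 1024, 85), (1600, 1200, 85)]:
    mhz = approx_pixel_clock_hz(*mode) / 1e6
    print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz -> roughly {mhz:.0f} MHz pixel clock")
```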
Is there a good breakdown of how an HD video signal is displayed? HDMI is only one of a number of different kinds of cables that can carry an HD signal.
I found this video to be very informative.
Fascinating video. You have explained things that were relevant to me when I used CRTs and that I never understood. I'm glad some of it is finally making sense and my past self can relax :-) Rocking the goatee too. Cheers
Man, this video is great! I knew a lot of this, but not consciously.
Awesome video! Very informative, and it helps me better understand VGA.
You used Doom as your example when talking about the limitations of the horizontal sync and the 320x200 res at 70 Hz. Is that why Doom had a frame limit of 35, or do you think that was coincidental?
Hey, did the follow up videos ever materialize? I have my own setup for VGA capture, but I'm always interested in seeing how other people do it.
This brings back memories of configuring XFree86 on Linux the old way, keeping in mind the warning: bad parameters can fry your monitor!
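(The sanity check you were expected to do by hand, as a sketch: take the modeline numbers - the standard 640x480@60 one here - derive the horizontal and vertical frequencies, and make sure they sit inside the monitor's rated HorizSync/VertRefresh ranges. The range limits below are example values, not any particular monitor's spec.)

```python
# Standard XFree86 modeline for 640x480@60:
#   Modeline "640x480" 25.175  640 656 752 800  480 490 492 525  -hsync -vsync
dot_clock_hz = 25.175e6
h_total = 800          # active 640 + front porch + sync + back porch
v_total = 525

h_freq_khz = dot_clock_hz / h_total / 1e3        # ~31.47 kHz
v_freq_hz = dot_clock_hz / (h_total * v_total)   # ~59.94 Hz
print(f"{h_freq_khz:.2f} kHz horizontal, {v_freq_hz:.2f} Hz vertical")

# Example monitor limits (illustrative only): a fixed-frequency CRT accepts a
# narrow band, and feeding it something outside that band is what "fries" it.
assert 30.0 <= h_freq_khz <= 70.0, "outside this (example) monitor's HorizSync range"
assert 50.0 <= v_freq_hz <= 120.0, "outside this (example) monitor's VertRefresh range"
```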
About the Atari ST video output: too bad I'm much more of a software guy than a hardware one. I'd like to connect mine to a VGA display. I know it's possible and it only requires a few components, but I've read it also needs a monitor that can handle "slow" signals, since the ST outputs low resolutions compared to VGA.
What microphone do you use? It looks neat
I'm having a problem with a Dell P780 17" CRT reverting its brightness values to 100% after a keystroke. This happens after watching a video, because of keyboard inactivity. It's very frustrating, because it's probably a minor issue with a capacitor or chip, yet it causes major visual disruptions. I can't remediate it with f.lux, because the brightness level it reverts to destroys the black levels, which is why most of us are using CRTs to begin with. Would isolating the unused VGA pins, or changing the custom timing in the Intel HD Graphics Center, possibly help? Or is this a novel issue that only some retired technician out there, who knows which capacitor needs replacing, could fix?
This video was very cool and enjoyable, but I have to know where you get such an awesome hat!
thinking cap!
All this aside - nice job syncing your camera, I didn’t see a SINGLE flicker on those things the whole intro.
Need help connecting a PS2 to a VGA monitor using its YPbPr signal, to get full resolution from the PS2.
Hello, I have a question: how can I display HDMI video on an SVGA mono monitor?
I'm calling in sick to work so I can watch this whole video. I really enjoy these videos. Please keep it up.
great video man but if you're going to read on camera will you at least look at the lens? :) Thank you for your work!
What about Amigas and the Atari ST, which kind of support VGA, but at only half the horizontal refresh rate?
VGA is more exciting than I thought
I want the little CRT monitor now. What size is it?
Ah, that Hyundai ImageQuest V770 monitor brings back memories. I used to have one when I was a kid (I'm only 18 now, lol) but one time when we moved house the built-in DE-15 cable got squished making the screen blurry. I miss it!
So... did you ever do that updated video... or ???
Another upload? Yay! 😀
I use my 2003 Dell CRT at 1400x1050 or 1600x900 at 150 Hz interlaced. It feels great and looks great for modern gaming, let alone vintage.
I haven't ever subscribed during a video before. Very good video, even for me, and I have a fair knowledge of electrical engineering.
I'm quite happy with my 1920x1200 LG VGA-only LCD panel. I tried overclocking it, but the logic board is an ass and will drop frames even at 61 Hz, and on top of that it doesn't even sync at 60 Hz - it syncs at 59.94 Hz (the classic 60/1.001 NTSC-derived rate), as if it were an NTSC TV. It will say 60 Hz everywhere, but when you look at the actual mode setting it says 59.94 Hz xD
How old is that LG panel? To be struggling to go to higher frequencies it must be at the edge of whatever logic LG had at the time for the video circuit. I know the cinema displays were kneecapped in order to get near that size back in the early 2000s, some requiring dual-link DVI ports to drive at full resolution.
Well-made video. Love the details; it shows a strong skill set. Great stuff.
You gave answers to questions I didn't even know I had. O_o
Whoa! That's a tsunami of information, and you barely took a breath. Thanks for the info... I think?
Nice new series
Can you get a VGA only monitor and try and damage it on camera by setting the wrong mode? This would be really interesting to see - particularly with the subsequent electronics diagnostics to see what failed.
Folks, please help. I need to use VGA video on a device that takes pure RGB with composite sync. Combining H and V sync is quite well documented. What I can't figure out is how to get the device outputting the VGA signal to detect that a display has been connected and start feeding a signal in the first place. I've read that if you ground the ID pins according to the DDC combinations, it'll trigger the device to start outputting video. Still waiting on pins to try it out, though.
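(On the sync-combining half of that, since it came up: one common quick-and-dirty approach is a single XOR gate across the two sync lines - effectively XNOR when both are active-low, as VGA's TTL syncs usually are. Here's a sketch of the logic only; polarity matters, and this crude method produces no proper serration/equalisation pulses, which dedicated sync-combiner chips do handle.)

```python
# Truth-table sketch of merging separate H/V sync into composite sync with a
# single gate. Assumes both inputs are active-low TTL, as on the VGA connector;
# a real circuit must match the actual polarities and may need a proper combiner.
def composite_sync(hsync: int, vsync: int, active_low: bool = True) -> int:
    x = hsync ^ vsync
    return x ^ 1 if active_low else x   # XNOR for active-low, XOR for active-high

for h in (1, 0):
    for v in (1, 0):
        print(f"hsync={h} vsync={v} -> csync={composite_sync(h, v)}")
```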
This makes me miss the Iiyama Vision Master Pro 450 19" monitor I had. Extremely colour accurate. I still beat myself up for giving it away...
So my monitor is emulating a CRT beam even though it's an LCD?
This video only went into the old fatscreen CRTs that I don't even use, so I am lost lol.
Ugh, this wobbly table gives me anxiety, please fix. Otherwise, great and informational video!
I believe I have an XGA display, then. Unfortunately, it no longer displays the color red, but I haven't figured out what the problem is, nor can I be bothered to figure it out.
Shelby you should work on some kind of Teleprompter device, Alec from Technology Connections made a video a while back of his setup ruclips.net/video/YeRu4xYH_W0/видео.html
It's not super annoying, but IMO it would greatly improve the quality of your scripted videos. Looking forward to the next one!
It's not complicated at all. It's just an extension of the old analog monochrome video standards introduced with television back in the pre-NTSC days: an amplitude-coded signal coupled with sync pulses to move the beam around. Old computers meant to work with TVs used exactly the same method, except they mixed the sync in with the video signal (plus the colorburst crap overlaid on top for the chroma signal). VGA doesn't muck around with chroma at all, so understanding any VGA signal just involves recognizing the sync so you can align the analog lines properly.
This video was very informative.
Thank you.