This is an older video I had saved as a backup for a while. It was done in July, before I got my current microphone and before I tried to do color processing on my videos. For reasons that will become clear soon, it makes sense to just release it now.
I have to admit, I somewhat like(d) the stereo effect from the mics.
Lemme guess, you got urself some chromakey gear?
You can download a manual here:
www.digitalfaq.com/forum/video-restore/6188-xv-c900-vs.html
Looks like it already has two video inputs. Here is the link to the manual from Sony.
docs.sony.com/release/XVC900.PDF
Okay, so here's the oral history from a video greybeard. This was from when I was doing industrial video, and this is how this was actually used.
First, I need to tell you this story to tell you the other story. You have to remember that this is the era of linear, tape-to-tape editing. That is who this is aimed at. You had two main types of editing, called on-line and off-line. (Nothing to do with networking as we know it.) On-line editing had big tape rooms of synchronized machines and computerized editing systems that could roll two machines at once (A-B Roll Editing) or even more (A-B-C, etc.). Those would let you do wipes/dissolves/FX from two video sources. On-line was big, expensive, and aimed at broadcasters. This product was not aimed at that market -- this was aimed at offline editors, guys doing training videos and wedding videos -- usually called Industrial Video.
Offline editing was two machines -- source and master. You never had a way to have two video sources synced and going at once. That's why this wasn't made for chromakey -- chromakey requires two sources. (You have luma-key on it, and it was used for solarizing-type FX.) Instead, this was used to sorta "fake" having an A-B roll -- you would wipe to a color (usually black), stop your edit, begin your next edit in black, and wipe out to your source. (Most modern computer editors still have this effect under "dip to black" or fade to black, which is what we called it then.)
In this context, the other things make a little more sense. The mic input is so that the guy making the lame training video for MegaCorp Industries can do his voiceover while the video is being edited (yeah, you did that sort of thing live, like Bill O'Reilly), and the color bars were both for setting up your monitor and for laying color bars at the start of your master so that the next guy can set up his system to match yours.
The main thing that surprises me is that there is no GPI trigger on it that I can find. You would normally expect something that can do these wipes to at least have a trigger input, but I suppose in 1991 you were still expected to do them manually. (My timing is still pretty good just from having to hit things as close to frame-accurate as I could in projects using gear like this.) A GPI trigger would let your edit controller tell it when to take the FX. On higher-end Sony gear, there would be RS-422 or even a direct Sony-specific connection to the Sony edit controller. This is the sort of gear you would put into your signal path when editing with an RM-450 (the offline editing gold standard), although it looks like Sony intended this for something like the RM-E700 controller.
FWIW, I was still occasionally using an RM-450 into 2010 or so to control a source deck for encoding old tapes to MPEG.
One more thing I thought of -- later in the 90s, you would see this sort of product include a frame-store (prohibitively expensive in the early 90s) and you would get exactly the sort of functionality you expected. You would play your source, hit a button to grab the frame, and then you could wipe between the live video and the framestore.
If you watch closely on cheap late-90s TV, you'll see this trick being used: you take a frame store of the first frame of your NEXT edit. Then, you go back, and take your last edit with a wipe to the first frame of your next edit, then cut into your video for the next edit. Then, what you have is your A video wiping to B(freeze) and as soon as the wipe is over, B starts playing. That's how you faked online editing in an offline suite with an FX processor that has a framestore.
Thank you for the information! That definitely helps make this make more sense! The offline vs. online editing makes sense to me. I wouldn't mind completing this as an offline setup for video production. It could be fun to try some day.
I have definitely seen the framestore technique before. I would have loved it if this had it because that "feels" more era appropriate for a medium budget production.
So broadcast programming (e.g., news programming, broadcast journalism, etc.) was always online? Could you have pulled it off with more budget equipment like that, for example if you were a start-up and wanted to get into the broadcasting industry? This stuff is SUPER fascinating!!
Robert Butler - it really depends. Most of the time, it would be done in an online suite for a lot of reasons, including having a computerized EDL (edit decision list) making recreating or recutting the material easier. It was pretty common for the initial cuts of shows to be made in offline suites with detailed notes (this is where SMPTE timecode comes in) of where the cuts are and what FX are needed where. Remember, in this world a simple dissolve is a special effect that costs real money.
A LOT of stuff never saw an online suite just because of the $$$ involved. There are a lot of shows that were done 90%+ in offline suites (a cut is the most common transition - you just stop seeing them and they are natural) because you would put the effects in the pre-produced opening, cut your whole show in an offline suite, and then take those big blocks into an online suite to do dissolves and the titles.
It wasn't that odd for an EDL in an online suite to look like: fade in opening, dissolve to first segment, fade to black, fade in second segment, fade out, fade in third segment, dissolve to pre-produced end (or black), and key in credit scroll. Everything else had been done in offline suites to make the segments.
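For anyone who has never seen one, an EDL event is just a numbered line giving the reel, track, transition type and duration, and the source and record timecodes. A rough sketch of what that "fade in opening, dissolve to first segment, fade to black" structure might look like, loosely in the CMX 3600 style -- reel names, durations, and timecodes are all made up here for illustration:

```
TITLE: EXAMPLE SHOW
FCM: NON-DROP FRAME

001  BL       V     C        00:00:00:00 00:00:00:00 01:00:00:00 01:00:00:00
001  OPEN     V     D    030 01:00:00:00 01:01:00:00 01:00:00:00 01:01:00:00
002  OPEN     V     C        01:01:00:00 01:01:00:00 01:01:00:00 01:01:00:00
002  SEG1     V     D    030 02:00:00:00 02:08:00:00 01:01:00:00 01:09:00:00
003  SEG1     V     C        02:08:00:00 02:08:00:00 01:09:00:00 01:09:00:00
003  BL       V     D    030 00:00:00:00 00:00:02:00 01:09:00:00 01:09:02:00
```

Each dissolve event carries two lines: the outgoing source (marked C) and the incoming source marked D with its length in frames (030 is about one second at 30 fps); BL is the black "reel".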
I used to drool over the ads for this type of Sony and Videonics gear in Video Review and Video magazines when I was a teen.
For those interested in the katakana in the copyright notice on the Sony chip, here is a fact:
ASCII, the Japanese company, also made semiconductors in collaboration with Sony at one time. They were also part of the Microsoft MSX computer project, which was not so popular in Japan due to the NEC PC-8800/9800 series and the Sharp X1/X68000, and they were a software company as well.
Today, ASCII specializes more in media stuff (ASCII MediaWorks).
ASCII is written asukī (アスキー) in katakana :3
Nice addition, I was wondering "What the heck, is that a text generator there? Why would they spell it in katakana then??"
Techmoan would love this device.
I really miss 80s/90s electronic stuff: big-ass mobos with plenty of chips with thru-hole legs, you can repair/mod everything, no SMD, no BGA, no tiny devices, just pure raw discrete madness!
SMDs are not that hard to work with. The truly annoying thing to deal with is multi-layer boards.
@@crimsun7186 absolutely agree!
I think the color bars are for sources that can provide a color bar output, which you can then match with the color adjust knobs.
The sticker on the back that says "1D" is a date code.
1= 4th digit of the year (1991)
D= 4th quarter of the year.
They've been using this for decades. They still use this type of date code to this day.
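If you want to turn that rule into something you can run, here's a minimal sketch. The quarter letters and the last-digit-of-the-year part come straight from the comment above; the decade has to be assumed from context, since the sticker only carries one digit of the year.

```python
def decode_date_code(code: str, decade_base: int = 1990) -> str:
    """Decode a 'year digit + quarter letter' sticker like '1D'.

    decade_base is an assumption -- the sticker only stores the last
    digit of the year, so the decade comes from knowing roughly when
    the unit was made (early 90s for this one).
    """
    year = decade_base + int(code[0])
    quarter = "ABCD".index(code[1].upper()) + 1  # A=Q1 ... D=Q4
    return f"Q{quarter} {year}"

print(decode_date_code("1D"))  # -> Q4 1991, matching the sticker on this unit
```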
Do you have more info about this code?
As far as I know, these were for color correcting while recording from Betacam to a VCR, and you'd use the color bars at the beginning for monitor adjustment, since Betacam was (and still is) used quite heavily for things like news reporting and archival footage.
You could use that thing to bypass Macrovision scrambling of analog video...
11:40 Is there some separate clock signal running between the boards, to keep the signals in sync? Then this is used to mix a sync pulse into the video signal before it comes out of the box.
The fader/dark-white "video art" is perfect for green-screen use in things; probably very nice for TV broadcast.
Great video; I came across this guy's little brother, the XV-C700, at work last week. Really cool device that I wish I had more time to play around with.
I have just acquired one of these units and this video was very helpful. But I am interested in circuit bending my unit and was wondering if you had the PDFs for the chips and if you found the schematics for it. If so, would you please share them with me? Thank you for making this useful video.
Color inversion, chroma key to solid color backgrounds, wipe transitions. I'm guessing many cheesy early 90s videos were made on this thing!
the cheesier videos were made on Amiga with the Video Toaster
I think the color-bar thingy can be used for prototyping either a scene-transition or an overlay
Maybe the built in color patterns are there for reference - just as a gray card or IT-8 color chart in photography? To correct the colors of a video to a standard - with a - erm - color corrector device...?
That is probably for chroma keying. Mixing two video sources without genlock was rather difficult without digitizers and framebuffers, so it would be easier to chain devices: this one would create the wipe, and the next device could use the chroma key to add a second source, e.g. with a genlock board, like on an Amiga or something like that.
Adding a second video source means syncing 2 video signals together which either means equipment with genlock inputs (=professional equipment) or having a framestore (>700kbytes of RAM). So that's not an option on such a relatively cheap device. You can't just mix together 2 unsynced sources to make a new signal, but if you have 2 synchronous signals you can even use a BNC-T splitter to mix them together.
If you want to do chroma keying, you need to look into video mixers. The consumer ones have the ability to sync sources together, though chroma keying is rarely an option on those, as composite/S-video doesn't really have enough chroma bandwidth to do that decently.
Such a colour corrector was used to adjust errors in the video you just recorded. After all, camcorders usually only had monochrome viewfinders, so you had no idea what your image looked like chroma-wise. Such a device can make your video look more professional or give your video a certain look.
The wipes are just an extra so you won't need to splice your image through so many boxes to make wipes, and colour bars are a nifty test signal to have, as their generation doesn't cost that much. It is possible that this device works with RGB or component internally, in which case adding wipes, luma keying and flat colour generation is trivial. Same goes for the microphone input: it's useful for hobby videographers who don't want to have an extra audio mixer. It doesn't cost much to add it, so they added it.
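As a back-of-the-envelope check on the "framestore needs >700 kbytes of RAM" figure above: an uncompressed 8-bit 4:2:2 NTSC frame along the lines of ITU-R BT.601 comes out to roughly that much. The exact sample counts depend on how much blanking you keep, so treat this as a ballpark, not a spec:

```python
samples_per_line = 720   # active luma samples per line (BT.601)
active_lines     = 486   # active lines in 525-line NTSC
bytes_per_sample = 2     # 8-bit luma plus alternating 8-bit Cb/Cr averages 2 bytes

frame_bytes = samples_per_line * active_lines * bytes_per_sample
print(f"{frame_bytes} bytes ~= {frame_bytes / 1000:.0f} kB per frame")
# -> 699840 bytes ~= 700 kB, a serious amount of RAM for a consumer box in 1991
```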
Does a device exist that could be used with this one for chroma keying a second video source? How about some specific video mixer models?
@@leajiaure Chroma keying is a feature commonly found in video mixers, and virtually every professional one has it. It's rare in consumer models. If you want to do chroma keying (actually luma keying) with this device, you'd need an external frame synchronizer; the most common ones are also full-frame TBCs. Then you need to feed that synchronizer with the sync signals of the other source, and in theory you can feed the synchronized output into it. It's much more effort than just getting a video mixer with chroma (or luma) keying and doing it that way, however. Those things aren't as expensive as they used to be. As video mixers typically have one button per function, you can just look for a "Chroma Key" button.
AkBKukU, I think the reason color bars are an option for a background is for when you want to calibrate your monitor to prepare for post-processing.
Can you imagine how much of a pain it would be to make something like this for HDMI? People don't realize how awesome analog video was for real-time video modification.
Basically you need a PC with a HDMI capture card and OBS :)
@@djdjukic Soooo... capture card, CPU, GPU, RAM, sound card, keyboard, mouse, monitor, operating system, OBS (and to get OBS, an internet subscription, modem, router, etc.), PLUS you may need an HDCP stripper depending on the source (and you'd better hope it's compatible with the version in question), all so you can (for instance) modify the green value. Versus with analog, you just boost the green signal; worst case you have to convert composite/S-video to RGB and back.
I get what you are saying though, I'm not trying to be a troll I just think people under-estimate the simplicity and elegance of older tech.
@@rubblemonkey6904 I know what you mean, and that would indeed set you up for an equivalent workflow. But there are also SoCs that can do what this device does, and they're used in mundane things like televisions. I agree that doing it all in analog was so intuitive, and that compared to NTSC, the HDMI standard is a several-hundred-page monster, but I really don't miss the signal loss and hundreds of trim pots. Well, not for everyday work. It's still a fun hobby :)
@@djdjukic TBH it seems you have WAY more hands-on experience than I do. Mostly I've done research on the analog video outputs of video game consoles, and I got so fascinated with it that I proceeded to try and research the feasibility of having an analog TV comparable in quality to a 4K TV, so I have next to no real-world experience to speak of.
@@djdjukic Being from the US I've never heard of V2000, fascinating stuff! Which did you personally prefer, VHS or V2000?
I wonder what the MSRP on that was when it was new.
Providing your own video backdrop either requires a genlock, i.e. a synchronization for video generation, or a framebuffer. You can't just splice two analog video signals together if they aren't synced.
Hey, what about transitions between 2 video channels like on TV?
Got a source on channel A and fade from A to B?
It makes sense that it can't accept a 2nd video signal. It would go out of sync with the first one unless it was genlocked.
Sony, I love you and your crazy ways.
What's the model number?
It's a Sony XV-C900 Color Corrector. I think I said it in the video but I'm not sure. I'll add that to the description as well.
@@TechTangents Thanks for replying! I need something like this for digitizing VHS. I wonder if it does time base correction.
@AkBKukU use the already de-synced video from one of the inputs on the back
That is an interesting idea. I'll have to look at that at some point. But if I were designing this, I would have the inputs all be connected to a switcher, then whatever removes the sync pulse. So I bet there is only ever one de-synced signal inside.
Wait, Miracle Designs? Like Rascal Racers Miracle Designs? I never knew they made a game on the Jaguar.
There is no sync out, as the wallpaper IC reads the original video sync and times itself to it. So you would never be able to hack in a live video; there is no frame buffer for one of the video streams, so it's impossible to merge them on top of each other. Amiga genlocks worked the same way: no frame buffer needed if you can lock onto the main external video H+V sync and generate the graphics as a slave, bypassing the normally generated H+V sync done by the Amiga chips.
Looking for this unit for my linear editing suite
Make that "... *its* best feature" -- no apostrophe. :-)
Oops, this was an old draft description and I never went through and checked it. I'm surprised that was the only problem. Thanks!
that's pretty nifty, there has to be another part to it
Quickly looking, I found this as a chroma key device: the "Ambery PIPV3"
S-video is still better than regular composite signals. Do you use a framemaster?
Wait a minute, how the hell did I never notice, I used something like this in AV class years ago! These things are classic, and I'll also never figure out what the microphone was for
I can imagine you making an old-style repair video with, like, old-style editing and a low-quality mic.
So one of these was used for all those TV shows in the 90s. Amazing how you just so happened to find one of these lol.
Nope, a lot of TV shows used the "Video Toaster"
awesome. luv the device. thanks for posting.
Can you use this as a colour corrector when converting PAL VHS tapes to digital?
Look for the owner's manual, it gives some examples (e.g. video editing) of what can be done w/ it.
Did you ever find an analog chroma key device to use with this? Does anyone know if that exists?
It looks like it already has 2 inputs on it.... The top and bottom and then the source 1/2 switches on the front.
Yes, but that only switches the foreground video. I want to change the background video
@@TechTangents thanks for explaining this. I had the same question regarding the existing inputs.
Huh. Odd. Then I’m stumped. (And I work in television.)
After finding one at a rummage sale this device became a fixture of my party house.
Hooked up to an old RCA Lyveum TV, this was the device that my idiot friends enjoyed more than the devices connected to it.
Feels like that thing was made for movie editing stuff.
Off-topic comment, but the MOD music from Atari Karts is probably the best OST I've heard.
And Tempest 2000 too!
The games from that era have really amazing OSTs (especially the MOD music)
I wonder what the inside of a Quantel Paintbox looks like ...
This is for TV broadcasting in the 90's, so you can do all kinds of transitions between programs in real-time.
Are you done with the broken Commodore floppy disk drive?
No, check the pinned comment.
But I'm waiting on some parts to come in for that project as well
Wow, this is so interesting. I also read some comments and OMG I love this. I wish I could buy one; there are some pretty cheap ones on eBay, but the sellers are from the United States and they don't ship here, or they do but starting at €50 minimum ;-;
No RGB though? Why not?
Amazing, thank you for your explanation, it's very interesting
5:55 This gave me flashbacks to Minecraft redstone engineering lol
Fading to green is pretty useful...
Crisper and sharper than 64 bits? Man I bet the Atari Jaguar was a massive success, got a ton of devs on board and sold hundreds of millions of units...oh wait, I remember now 😢
I like how you added a voice in editing instead of just a title, really helps us blind folk out.
I have got 2 Sony video processors: the XV-C700 (composite) and the XV-C900 (composite + Y/C) 😜
After opening it up I would have said, "Nope. I can live with whatever the people at Sony designed".
God that last part is so VHS.
It's just s-video
I want one just to look cool on my audio rack
I imagine this device would cost you an arm and a leg back in the day. Such a monstrosity couldn't have been cheap to make.
1991 sounds accurate. I figured something like late 1980s or early 1990s.
Shit, I have a simpler device that does the same thing, except it just sharpens or softens the image. I used it with Laserdiscs till I got my receiver which does well with analog video sources.
Sherpaning mask, what's that? Does it make it take a hike?
You have a working Jaguar CD?
Either you must be lucky or you know what you're doing xD
First thought: The form factor of that thing is very odd.
It's not when you get the matching VTR and start stacking things up.
@@83hjf I mean, the form factor says consumer AV gear to me, but the functionality says commercial use.
The real nightmare of not having multi-layer boards and SMD components...
By the end of the video I thought I was watching an old uxwbill candyham video or something from slugbug.
I finally know how to pronounce your name!!!
Instead of hacking it, I would've gone back to the Goodwill over the next few days and gotten the rest of the pieces they had ;)
10:22 Did somebody mutter “Apple”?
Before MAGIX Vegas...
Not only do you own a Jaguar
you have a Jaguar CD
AND a 6-button pad
I could smell the paper PCB through the screen
Final Cut Pro....the first edition.
Wow, the audio was real shite back then.
I have one of these 😂😂
Who's before 200 views?
waw
early gang
its cool
500 like
My content is pure Scotland. None of it is real, I don't know who you say.
No true Scotsman doesn't give your money back when you ruin everything.
"Sherpining mask."
Ehee... It's not funny...
First
That is not right. I was first!
Mh. It changed, well played.
@@aveavis2247 are u sure?
@@soleroks Actually, yes. I gladly leave the post to you. :-D
I hope I translated it right.
@@aveavis2247 Yes, right, but I don't know English very well either :D
First!
Wow, I really didn't expect there to be other Germans who are as crazy as I am and watch AkBKukU videos... Are you also over at Druaga1?
Always these Piefkes...