TD here. I use the manual T-bar mostly during artistic programs where the feel of the timing or pacing of the program and transitions is highly subjective. For example I'll use it during concerts to "feel" out the speed of dissolves/fades live on the fly in sync with the musicians. E.g. the camera zooms in slowly as the violinist draws out that last slow sensual note and we fade in time to the wide shot where everything "lands" in harmony and it just "feels" "right". My favorite moments of TV magic are when all the parts, including time (which is what the T-bar is for), come together to make something special.
Good TDs are essential for big broadcasts, and this adaptability is a big part of it -- understanding the content and intentions not just "pushing buttons in order"
yeahh from experience the DaVinci he's using has the worst, crunchiest, most friction-y t-bar I've ever felt, worse than those cheap ones that use audio faders
Re: "who uses the cheezy transitions?" I used to do technical direction for live events, rock concerts, churches, etc., and the answer is churches. Churches can't get enough cheezy transitions! And I would imagine the vast majority of switchers ever purchased were for churches. They are also used a lot for "image magnification" aka "imag" at live events. Probably the funnest work I ever did in live video was calling shots on those things, it's like playing a video game and you get to wear a big headset and play with the jet fighter controller t-bar!
When the community TV station I work at started in 1993, our first switcher was home-made. It was just a button that mechanically switched between the two consumer VHS camcorders that served as our first studio cameras. It was dubbed the Bang Box, and when you look at our earliest shows, the picture would roll every time you switched to the other camera because of the lack of sync.
a bang box! that's what we used to call a raw non-vertical-interval video switcher too! Sometimes all you need is state of the budget, not state of the art, to get the job done! :)
@@CathodeRayDude Other things I'll add: I've never heard the term hot-punching before, and I find it funny that it's considered taboo sometimes. We use both methods depending on what we're doing. If we're taping a talking head show with three cameras, we'll hot-punch because you can just keep three fingers on the Program row and it's easy. If we're doing something like Bingo that uses the Key inputs on the switcher to overlay cameras on top of the bingo graphics, you can't hot-punch because it will change the bingo graphics to a camera that has other cameras keyed on top of it, and it's chaos. Ask how I know 🙃 As far as transitions go, the only thing we've ever used besides a dissolve or a wipe is during our live Christmas telethon, we'll use a media wipe with pretty snowflakes to transition between the studio and the pre-recorded clips of kids singing. The only thing I've used the T-bar for was to do really slow dissolves when we had a musician in the studio giving a performance. That's how you also do that thing where you partially dissolve between a wide shot of the musician and a close-up side view of their face. On a good switcher, the T-bar is dampened with what I can only assume is really thick grease, so it's very easy to get smooth transitions using it.
As a broadcast TD; your first example is pretty spot on. I’m always worried people will figure out how simple it is to do at its core 😅 I explain what I do to people as “I use a really expensive keyboard” or “I push buttons”
@@strapsgamingvids TBH, unless you’re in a top 10/25 market you’d probably make more working at Starbucks 😉 I was market 100 or so and baristas were earning more than I was directing/TD-ing at a broadcast news station.
@@beardsplaining see even in shit like this nerds end up getting fucked over :( I'm sorry to hear that dude I hope things improve and at least its steady and stable with some perks.
@@xLilJamie I work in public broadcast now. Infinitely better than commercial stations in so many ways, especially pay and benefits 😉 Broadcasting is the career I picked up as I neared 30 and got burned out on physical labor. Been in the industry almost 10 years now and make a decent living as a Director/Videographer/Editor
I think that's the core of a lot of trades and skilled service. A network engineer is just plugging cables into holes of the right shape and choosing the right color. Auto mechanics just take stuff off and put other stuff on. Plumbers just install and cut pipe. Television directors just cut between cameras. Technically it's all true, but you can rest assured that sticking any layperson in there would result in them being utterly confused about 30 seconds in, as soon as a second task in the job needs to be done.
Glad to see the mixer I saved from e-waste and sent to you will live on forever in this video! I did a bunch of shows on this unit and you found features on it that I didn't know existed (the dip wipe!) but you're right - we never used the fancy DVEs, it was mostly cuts / dissolves, downstream keys and chromakey. I'll point out that the Grass Valley switchers have the program bus at the bottom, and Echolab had the preview at the bottom - as someone who would do weekly shows at 2-3 different studios I always had to remember to switch - or else you might cut live at the wrong moment! (and that did happen!). Great video!
Can we take a moment to appreciate the sheer number of cameras in the same room all of which he's managed to hide from each other for this production? I think I've counted 2 HD angles and either 2 or 3 SD angles.
oh man the room was VERY crowded. I didn't take a picture once it was all set up, but you're absolutely on point that it was hard to squeeze it all in. I had two Blackmagic pocket cinema 6Ks running, one on a big tripod with a teleprompter, one on a little manfrotto video tripod, but both trailing cables across the room; then I had another big tripod with a Sony shoulder-mount ENG camera, and a second Panasonic ENG camera sitting on a pile of boxes on top of a rolling cart on the other side. A huge cable snake down the center of the floor to get video in and out of the mixer rack module in the other room, plus the boom mic stand and cabling, and three LED floods with softboxes. Me and my gf (camera op) had to climb over cables and squeeze around the back of the presentation desk every single time we had to move anything. I am INCREDIBLY lucky that my viewers make it possible to afford a studio space to do this in; it would have been literally impossible to shoot this video at home, it just requires too much gear!
@@CathodeRayDude And it makes me sad to see that the vast majority of viewers (not only here) won't ever be aware of the insane effort required to produce something that most of them would regard as "normal low effort crap". On the other hand, you tend to attract the kind of audience that IS aware of this, which is part of why we're coming back for every second of your wonderfully nerdy and in-depth stuff, even if it's an off-the-cuff side note. You care, and it damn well shows. And we love it!
Hopefully he doesn't reuse. I have been using MEVOs which means I can cut to one camera and adjust another and come back and it looks like I have 4-5 cameras.
I don't understand. Why did it require multiple cameras and all that to film himself behind a desk? Honest question because I don't understand what was going on.
I really enjoyed this video. Having worked in broadcast TV news for 10 years, and now running my own production company that specializes in on-location, live, multi-camera production, I use video switchers on a regular basis. So, seeing the collection of video mixers you had gathered here was a lot of fun for me. There are a few things I think I might be able to clear up. First, on the topic of A/B style switchers (SFX generators) versus Program/Preview bus switchers- A/B switchers, like your Panasonic switcher in this video, were mainly used in video editing applications, while program/preview style switchers were/are for use in live multi-camera production. Before computer-based video editing was really a thing, video was edited tape-to-tape. The simplest form of this was having a playback deck and a record deck. You would have an edit controller, or controls on the tape decks themselves, to mark in and out points and record just those selected sections to your program tape. However, if you wanted to edit something that was more complicated, and had more than cuts-only editing (an entire TV show for example) you would have multiple playback decks and your record or program deck. A special effects generator, also known as an A/B style video switcher, would connect all the source decks to your program deck. You would use this along with an edit controller, that selected your various in and out points and controlled tape transport, to actually edit the program, with your video mixer performing the various dissolves, or keying in things like titles. This is why the Panasonic video mixer has built-in audio capabilities, for four audio tracks, the standard number of tracks that were supported on professional Betacam tape.
This is also why you have the program select button, you could select to send the program output to a third playback deck, in case you needed to dub scenes from one playback tape onto another, to allow you to perform an edit where you dissolve between them. Typically, the edit controller would also have a preview button, allowing you to preview an edit and rehearse a dissolve or effect before you hit the edit key and committed it to tape. So a preview bus output on the mixer wasn't needed. The reason why you had an A/B bus, instead of a program/preview bus, is that this kind of editing came from film editing, where you would have an A-roll and a B-roll. These kinds of video mixers were also known as special effects generators because that was their intended purpose, to insert special effects in editing, not to be used for live television production. In an online edit suite, where you would edit a full and complete TV program, you would have multiple playback decks, a master record or program deck, an edit controller, a character generator, a special effects generator (or A/B switcher), and a basic audio mixer if the switcher didn't already contain one. The second thing I think I could shed some light on is why a TD would use a T-bar for a transition instead of the auto button- There are two answers to this- First, sometimes you just want to. There isn't a reason to use the T-bar instead of pushing the auto button, but when you've been doing a 4 hour long broadcast morning show, using the T-bar every now and then helps mix up the job a bit so you don't fall asleep. However, the bigger reason is something I do frequently in my work, which is during live coverage of musical performances. Sometimes during a slower song it is appropriate to do an extra slow dissolve, while other times a dissolve is appropriate, but you want it to be a lot faster. Next, I'd like to address how and when all those DVE transitions and wipes are actually used and what they are there for.
It is true, on a basic 1 M/E switcher they don't have much use. However, when you are switching a program with multiple M/Es you can use those DVEs to build various scenes. Combined with keyed-in graphics on the M/E, you could build things like boxes, where you have multiple reporters on location and your anchors in the studio for example. In these cases, the T-bar on these M/Es might be left in a partial position. But you can get these elements set up on the M/Es before the show, or save them as a macro, depending on the switcher you are using, and then during the program bring up that scene you built in the M/E as a source. The DVEs aren't really for doing transitions, but for building these kinds of scenes or elements.
"film editing, where you would have an A-roll and a B-roll" Wait, is this where the term "B-roll" referring to some unrelated shot you'd cut to when you want to hide an edit came from?
@@scorinth Basically, yes. Before video tape, TV news stories were shot and edited on actual film. Back in the day it would be normal to have a news anchor give a quick brief of a story, then promise, "Film at 11:00." For quickly editing these news stories, you would have your A roll and B roll. Your A roll would have your package's sound track. You would edit your A roll first for sound. Then, you would splice in your B roll material. When the move was made to videotape, these terms remained. Even though the tech was now different, you would still edit a news package for audio first, then go in and insert your "b-roll" material, even though your B roll was now a tape instead of an actual film roll.
I'm a Technical Director for a news station. You got the basics! DVE can be super useful. You can place multiple cameras in PiP (picture in picture), then make another scene with those PiP windows in different places. Store them as memories and when recalling them, the mixer will interpolate/animate the movement of the PiP windows. I use that for taking either the outdoor studio camera or weather map; it gives the PiP windows a neat zoom-in transition. Our lower 3rd and full screen gfx inputs are still called CG, but they're fed by a PC playing back transparent ProRes files. What in god's name was that Audacity skin? I like it!
PCs are great for graphics -- easier to design and easy to rewrite stuff on the fly -- plus it doesn't directly sit in the pipeline so worst case you lose the graphics but keep the video switching
One other major massive difference between hardware and software based mixers is latency: how long it takes for a signal going into the mixer to come out the other end. A software based mixer needs to capture each frame into memory, transfer that frame over into the software, let the software process that image, and then render it back out to a frame buffer where hardware can turn it back into a video signal or an encoder can compress it into a stream. Couple that with most software mixers working with compressed video sources that also take time to compress, move, and uncompress for processing, and the time adds up more. The latency for a camera through a software mixer can be from a considerable fraction of a second to many seconds long. Thankfully most software mixers handle the audio too, so they also have some chance of keeping it in sync with the video. The delay in most professional hardware mixers is less than a single line of video end to end when working with synchronized source signals, and typically one frame when working with DVEs. If you need to buffer your incoming signals to synchronize them, that adds up to 1 frame, on average half a frame. Having a minimal delay is of course important to keeping things lined up with the audio that's being processed elsewhere, where a short, consistent delay can be compensated for. But while a TV or streaming audience won't notice or care about latency, anyone watching on a screen in the same space as the action (like a large video screen at a concert, sports arena, church, or conference hall) is going to notice that delay really quickly. When feeding cameras to screens, we want the light that went into the camera's lens being shoved back out to the projector as quickly as possible. We'll run sync/genlock cables out to our cameras and turn off all the extra frame buffers along the way to try and keep that delay at under a frame (one 30th of a second [~33ms]) all the way through.
Which is how long it takes for the sound from the speakers to move about 37 feet, which is a reasonable distance to many of your closest audience members for things to still line up in our brains.
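The 37-foot figure above checks out with a quick back-of-the-envelope calculation; a minimal sketch, assuming an approximate speed of sound in room-temperature air:

```python
# Sanity check: how far does sound travel during one frame of video
# latency? Speed of sound is approximate (air, ~20°C / 70°F).
SPEED_OF_SOUND_FT_PER_S = 1125.0

def sound_travel_ft(latency_s: float) -> float:
    """Distance sound covers during a given latency, in feet."""
    return SPEED_OF_SOUND_FT_PER_S * latency_s

one_frame_30fps = 1 / 30  # ~33 ms
print(f"{sound_travel_ft(one_frame_30fps):.1f} ft")  # → 37.5 ft
```

So keeping the video path under one frame of delay keeps it inside the audio delay your nearest audience members already experience from the speakers.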
Given that streaming services have quite a big delay by default, on the order of 20-30 seconds at least, and streamers usually force an artificial delay of minutes to avoid ghosting or to manage chats, it's understandable that software mixers are good enough for that, while that latency would be bad for live TV.
@@fordesponja that sounds quite bad, and I'm not sure what service you're referring to; Twitch often has sub-10s latency (measured in a not very accurate way, from chat messages and when they show up on stream - I've seen the latency debug output get as low as 1.5 secs, but I'm not sure what that's measuring), and there were some attempts to get it quicker with the likes of Mixer
This is a side effect of the highly abstracted and modular thinking of modern software dev. Instead of having specific wiring for a specific effect, you put together this input card, general purpose processor, and output card, and now you can do *anything* to any part of the frame, which is really awesome - except it adds some frames of latency. Open your phone camera app and wave your hand in front of the phone - you can see the screen is delayed. Your phone is capturing a whole camera frame, then the whole frame gets passed through some specialized processor (to save power) that does brightness and colour balance stuff into the main system RAM, then the camera app notices the new frame is ready in RAM and queues it up for the next time the screen refreshes. When the screen refreshes, the graphics system puts together all these pending sub-frames (status bar, app, soft button overlay) and now that's ready for the *next* screen refresh. It's basically put together by the creator of each part thinking about what to do with entire frames and not individual pixels as they come in, so each part of the system adds 1 frame of latency. To be clear, it's totally fine for a lot of applications. --- There's no theoretical reason software couldn't process pixel-by-pixel, but it would probably still add a few pixels of latency, so genlocking the output with anything else would be tricky - but if it's the final output, maybe you don't care about that.
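The stage-by-stage buffering described above can be sketched as a toy model: if every stage holds a whole frame before handing it on, latency grows linearly with the number of stages. The stage names here are illustrative, not taken from any real phone or mixer:

```python
# Toy model of a frame-at-a-time pipeline: each stage buffers one
# whole frame before passing it on, so latency accumulates per stage.
FRAME_MS = 1000 / 30  # one frame at 30 fps, ~33.3 ms

# Hypothetical stages, roughly following the phone-camera example above.
stages = ["capture", "ISP/color", "app queue", "compositor", "display scanout"]

def pipeline_latency_ms(num_stages: int, frame_ms: float = FRAME_MS) -> float:
    """Total latency if every stage adds one full frame of buffering."""
    return num_stages * frame_ms

print(f"{pipeline_latency_ms(len(stages)):.0f} ms")  # 5 stages → ~167 ms
```

Which is why even a handful of "perfectly reasonable" one-frame buffers adds up to a delay you can easily see by waving your hand.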
@@fordesponja even for tv, latency is not a problem. but for a live show, where the audience sees the stage and projection at the same time, it matters a lot. i try not to even use any digital image correction on projectors to minimize latency, and the mixer i mostly use has 8 lines of delay (0.07ms @ 4Kp50)
@@miawgogo competitive Twitch streamers add that artificial delay. I don’t watch Twitch anymore but I’ve seen anywhere from 5-45 seconds in like 2010-2016
41:50 One application for using two _program_ versions is during a concert. You can have one output going to tape, or broadcast, while another output sends a signal to on-stage screens. This way you can have a wide shot aired, with a close-up of the singer on the screens that are located on the stage. Having the same image on both would result in the Droste effect, where the on-stage screens would display the same shot as the wide angle shot. So, the shot repeats itself in the screen... over and over again.
First time I've heard that name too, thanks! We always call it the "infinity tunnel" or "infinity loop"; though sometimes I also call it "video feedback" in technical discussions.
@@hyvahyva The name "Droste Effect" comes from a brand of cacao powder. On the red packaging there is a woman holding a can of cacao powder. And on that can there is a woman holding a can... This is why it is called the "Droste Effect". But calling it video feedback or pixel tunnel will work. As long as the people in the conversation know what you mean, right?
Many years ago I was operating a camera for a show where the director mistakenly cut to a particularly bad case of the looping screen, their response was just a somehow comedically perfect "whoops". As long as I worked with that company, it was henceforth called the "whoops effect". 🙃
When you said on-stage screens I thought you meant little monitors for the "talent" to watch. Now I realize you mean the giant display for the audience located above the stage.
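For the curious, there's simple math behind why the feedback "tunnel" only shows a handful of visible copies: each nested copy shrinks by the fraction of the frame the screen occupies, so copies fall below one pixel very quickly. A sketch, assuming the on-stage screen fills 25% of the wide shot's width (that figure is just an illustration):

```python
# Count how many nested copies of the frame stay at least 1 pixel wide
# when a screen showing the program occupies `scale` of the frame width.
def nested_copies(frame_width_px: int, scale: float) -> int:
    copies = 0
    w = frame_width_px * scale  # width of the first nested copy
    while w >= 1:
        copies += 1
        w *= scale  # each level shrinks by the same factor
    return copies

# 1920 px wide shot, screen at 25% of frame width:
# copies are 480, 120, 30, 7.5, 1.875 px wide, then sub-pixel.
print(nested_copies(1920, 0.25))  # → 5
```

The closer the camera zooms into the screen (scale approaching 1), the more levels survive, which is why a direct punch-up of the screen feed produces the dramatic endless tunnel.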
I took a video production class in middle school in the 90's that did the morning announcements for the school. We had a very cheap A/B switcher and an even crappier CG machine. It is really cool seeing more professional gear. And as immature middle schoolers we used every single type of wipe that the switcher could do, usually using multiple different ones in a single morning.
Split screens were used by UK news outlets to get around social distancing, with the set very well designed so that both presenters appeared to be sitting side by side; it would only look really odd if a coffee cup was in the middle. Also, they used it for a program in the 70s or the 80s to make two quiz teams appear one above the other.
University Challenge! People finding out they don't literally have double decker quiz booths and it's just a vertical cut and getting angry is evergreen
Fun fact: If you were downloading TV shows from the Internet back in the day, they would occasionally come out _before_ the show's airtime. These almost never had the station logo in the bottom corner because that's only inserted, as mentioned in the video, when the program gets played out. Not to get too deep into the weeds, but the station logo is typically inserted as the program is played out to the transmitter. If the logo isn't present on the program, that's called a "clean feed", which is how programs are typically transferred. The fact that the logo isn't seen on the downloaded program means it was captured from a clean feed at some point before the program was played out for broadcast.
Finally, someone else who remembers pirating pre-release shows! I'd get episodes of The Simpsons almost 24 hours before airing, it was great. My friends didn't believe me at first. I think I read that the release group (IIRC they were called FTV, on a tracker known as donkax) had a C-band style satellite dish pointed at the Fox distribution feed, which was unencrypted at the time. This was circa 2002-2004.
@@pap3rw8 hilarious lack of security on Fox’s part, to everyone else’s benefit! I always preferred the clean downloads for visual reasons, and often wondered why only some came that way. Especially since DVDs often weren’t out yet. Now I know!
@@BobbyGeneric145 Either that or, as pap3rw8 pointed out, they got it from unencrypted C/Ku band feeds (it's also worth mentioning that, even if the feeds were encrypted, sometimes they were encrypted using a system that was plagued with security flaws, and decoding the feed was as simple as inputting a HEX key on the receiver; see the history of Nagravision systems for an example of that)
@@BobbyGeneric145 some of the clean feeds are grabbed from distribution satellites. I had a buddy in the bay area that would grab them that way, he had a dish on a motorized mount that he'd just point to the network satellite he wanted to snag the feed from, and rip it to a .ts file.
A few comments from someone who worked as a vision mixer (which is "european" for TD) around ten years ago:

- At least in my area, the only type of production I worked in where one would use preview for cuts was in news, because of the rapid, scripted cuts you would have to make. Your right hand would be ready on the cut button for the next cut, while the left hand was ready to preview what came after that, so you always had the next two cuts ready. Anything else we were taught to cut like the BBC vision mixer is demonstrating at 23:48, left hand for cuts and right hand for dissolves. Using the pvw row for cuts just adds an extra step that you shouldn't need; a big part of the job is always staying one step ahead so you know which cameras you can cut to before you need them.

- About the wipe patterns and the other "fun" effects, you're right in that they hardly ever get used. Mostly they're used to hide the transition that happens behind a stinger (if it never goes full screen so you can just cut behind it), and even that usually just needs a regular linear wipe. But you never know when someone will have an idea or request for an effect or transition that needs it, so why remove them? There are other things in the mixer that depend on pattern generators anyway, so adding/keeping a few "extra" generators in order to keep the wipe functionality is probably no extra cost at this point.

- ME steps are commonly used. Mainly, as you say, for "submixing" of effects. In most cases, you could in theory do everything on the program row, but it is much simpler and safer to prepare an effect on an ME so you can just cut to the finished effect, especially when you need to load different snapshots or effects during a show. If you build the effect on an ME, you can just quickly store a snapshot of the entire ME and know that it won't affect anything on your PP (program) step when you recall it.
Another benefit of the ME's is that you can have more than one panel/control surface controlling different ME's on the same mixer. For example, on mid-size sports productions, we sometimes use the main control room in the OB truck for the "host" production and the "second production" room for the "national" production. Since they're technically just different ME steps on the same mixer, the national production have access to all the same sources in addition to the host output.
I worked at a small TV station many years ago. We had a Sony video mixer for production whose control surface was rack mounted, and rack rails were embedded into the table so it could sit recessed in the table. So rack mountable equipment in a table is a thing.
TD here (Blackmagic ATEM Production Studio 4K); a lot of modern video mixers have audio. One of the main uses is automation in AFV/VFA (audio follows video/video follows audio). But as you speculated, the internal media players use it as well. It's also a great way to set limits, etc.
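The AFV behavior mentioned above boils down to simple state-following logic: when the program bus takes a new source, that source's audio channel is opened and the outgoing one is closed. A minimal sketch of that idea; this is not Blackmagic's actual API, and every name here is made up for illustration:

```python
# Hypothetical sketch of audio-follows-video (AFV): cutting the program
# bus automatically mutes the outgoing source's audio and unmutes the
# incoming one. Class and method names are invented for this example.
class AfvMixer:
    def __init__(self, inputs):
        self.inputs = list(inputs)
        self.program = None     # current program-bus source
        self.audio_on = set()   # sources whose audio channels are open

    def take(self, source):
        """Cut the program bus to `source`; audio follows the video."""
        if source not in self.inputs:
            raise ValueError(f"unknown input: {source}")
        if self.program is not None:
            self.audio_on.discard(self.program)  # mute outgoing source
        self.program = source
        self.audio_on.add(source)                # unmute incoming source

mixer = AfvMixer(["cam1", "cam2", "media player"])
mixer.take("cam1")
mixer.take("media player")
print(mixer.program, sorted(mixer.audio_on))  # → media player ['media player']
```

Real switchers layer fades and per-input AFV enables on top of this, but the core "audio state tracks the program bus" rule is the same.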
What the wipe and other transitions were used for was mostly wiping graphics onto the screen. Back in the day, your lower third super generator just output a static image, so to get a nice effect of it sliding or wiping onto the screen, the vision mixer was the thing doing the transition between the normal shot and the same shot with the lower third graphics _superimposed_ on it - hence the term "super". EDIT: Ok, you kinda got there yourself. But yeah, by the time this box was on the market, it was doing the keying of the graphics all by itself. Before that, the graphics would've come out of the character generator (CG)/Chyron/graphics generator into a keyer, which keyed the graphics over the shot of the talking head or whatever and fed that as a separate input into the vision mixer, which would usually be called the "graphics" or "Chyron" input, depending on how American your facility was. Often the graphics operator would have their own mini vision mixer, so they could overlay the graphics onto inputs separately to the main vision mixer.
JD the TD here. Loved your video and great job in explaining a very complicated subject in layman's terms. I am a broadcast TD and use the big giant Sony and Grass Valley Kayenne switchers shown in your video, on a daily basis, but have also used many of the older examples too. My first switcher, in church, was the Panasonic WJ-50. The Preview output that you mentioned not being able to find was not to preview a bus, but to preview an effect like a chroma key or downstream key (DSK), etc. before taking it. I also remember using a larger Echolab switcher in a remote truck in the late 90's. It had a Pgm/Pvw bus, but the two additional M/E's were A/B buses. It was very confusing to have both on the same panel, and it "bagged" me several times when I would get called to use it. It's like Echolab was too cheap to make all three M/E's flip-flop like Pgm/Pst, but by having it on one, they could be competitive with other brands that had it on their switchers. I hated that switcher! The most challenging part of running the big broadcast switchers is remembering where things are located in the menus to do what you're trying to accomplish. Most productions that I have been a part of have a separate video director calling the cameras, but there have been the occasional shows where I have done both or "called and punched" my own show. What the general public has to know is that television production is a field where there is very little tolerance for errors. It requires a great amount of focus and mental stamina. It's more than just hot-punching two or three cameras, like in your video. It can be more like punching 5-7 cameras on a talk show or game show, and then being expected to record multiple shows, flawlessly, in the same day. Or being on a LIVE-to-air show like the news for four hours straight, or cutting a music show or sporting event with more cameras than you have fingers and being expected to hot cut on the beat.
By the end of the show, you walk away and your mind feels like a bowl of jelly. It is not an easy job in the least bit, but if you are good at managing a lot of signals and multitasking, can be very rewarding.
I worked as an interpreter for a news network for about six months, and I never saw more than maybe ten keys used on what was a gargantuan mixer. Audio was strictly a separate affair. The t-bar was frequently used though. They would use it to perform cuts between different sources. Sources would be news agencies. Here's an overview of what went down, or what I deduced from observation: The network I worked at didn't have any news broadcaster or any sort of live feed shot within the premises, i.e. the modus operandi of Euronews. Agencies have constant video and audio feeds that the network had subscribed to. There are monitors in the production room that each exclusively display one particular agency. The director seems to have notifications come to them from an RSS feed or something like that about the contents of each agency feed. The director gives the general direction, but I actually did that in their stead from time to time, if you can even believe that. Anyway, the TD would pick a source, preview it with its audio using either a headset or a little loudspeaker, but that was just to check whether or not there was proper workable audio. The CG operator prepares the text, something like "man marries his sister in Alabama", wraps the text in whatever graphics and adjusts the graphics as necessary, then feeds it to the mixer. The channel logo, the date, or time seem to be added by the mixer, but they are a rather standard affair since they were constant, and I imagine they just press a key to display a whole bunch of them at the same time. The TD picks the feed, applies the logos and shit, applies the output of the CG generator, which was typically referred to as "KJ" (abbreviation for "karakter jeneratörü" in Turkish), then the audio engineer, who happens to be the best AE I have ever seen in my ten years as a simultaneous interpreter (and I have worked with so many), listens to the audio feed, does necessary adjustments as he sees fit, then adds my voice on top of it.
The t-bar was almost always used in transitions of any kind and I imagine it is to give the TD complete control of how things go down, something I observed to be essential if you want things to go smoothly. All these were what I could deduce on my own since TDs were not so eager to teach me anything as if I was gunning for their job or something. I am probably mistaken in quite a lot I have said so please correct me if you know better I really wanna know. Stellar content btw.
@@AerinRavage The T-bar on my switcher occasionally has space laser sound effects. (Which I make to amuse the crew, or myself. The camera operators know what a cross-fade sounds like on the intercom: "Ready 2 with a fade ... Biuuuuuzt!. 2's up, 3 clear.") 😁
I was going to mention that myself... I saw Star Wars in 1977 but it wasn't until I got my first job as a Broadcast Engineer in 1985 that I found out what that cool looking control board actually was. I looked after three studios all with that same Grass Valley mixer.
I grew up in a TV family. My dad worked for CBS for almost his entire career, my uncle worked in remote trucks doing sports broadcasts, and now both my brother and I work in TV. He is an audio mixer and I am a colorist/online editor. All this is to say that when I was a kid, every time I watched Star Wars (which was a lot) and the scene where they fire the Death Star laser came up, my dad would point out that the controls they're using were actually some Grass Valley video switcher he was very familiar with from years ago. So, there's another use for that T-bar.
One common use for multiple program outputs is also live screens on set or for events with a live in-person audience. For example the big screens for the live audience at a concert are there for close-ups of the musicians, and maybe some graphics or clips playback as part of the "set" and/or lighting design, but they wouldn't ever get the wide-shot cameras that are used to show the TV (/stream) audience the scale of the event. Skilled Directors or TDs will even synchronize content across outputs to build virtual "layers" through the real world: the wide shot of the concert stage where the crowd is going wild reacting to the guitar solo they and TV are seeing in a close-up shot on the screens that flank the stage... Then as the solo ends, both the TV program and the venue screens cut back to the lead singer together for the next lyric. Thankfully modern mixers/switchers also have a lot of automation features to assist the operator/s in managing all those outputs.
Hey Gravis - love all your videos. I help TD for our local news station (CBS affiliate) as well as a community TV station. You hit the nail on the head! When I'm switching I'll use the auto transition button the most, but I like using the T-bar to control the speed of dissolves. Every so often I'll use it to stop a wipe halfway for a split-screen, but we don't use wipes all that often. With news we'll groan over the stingers (too many times), but also some of those cheesy-type picture-in-picture swooshes. MEs are our friends too. Loved seeing all the different varieties you have. If you want some of that experience live-switching a real show, get in touch, haha!
55:37 SD is still suitable for those screens at concerts; an LED screen that far away doesn't need to be HD. As a matter of fact, tomorrow I'm working a huge concert, "Musicshowscotland" in Rotterdam _(The Netherlands)_, with SD LDK cameras on huge LED screens. Works like a charm.
I remember being in the video production club in high school and using these to produce the morning news at school. We had TV sets that piped our production to every classroom. I was thoroughly surprised to find out that almost nobody else had these in school.
I was also in a video production club in HS, but I graduated in 2020. I'm incredibly jealous because we did it in the most boring way imaginable: Prerecording everything over the course of a week, editing it in Final Cut, then uploading the rendered video on YT 😅
In my TV viewing experience, the stingers are most often used when switching between a live feed and a replay in sports content. Like, regular cuts are for changing angles on the same action, and stingers are for... ehm, time travel. Likewise, in news they seem to be used most often when switching to a remote location or a prerecorded segment. So, even when not done particularly well or seamless, they still serve the purpose of marking the particular cut as a "bigger event" than just a regular one, as a visual cue for the viewers.
I think it's also relevant that you expect people to only half pay attention to the news (and sports, to a lesser degree), and only direct their attention to the parts of the broadcast they care about. A stinger is usually animated in otherwise-unusual, full-screen ways, which is good for alerting peripheral vision that something is happening.
I'm a huge nerd but I ended up going to film school. I used to collect so many cameras and stuff, but the drudgery of adulting has largely made it difficult for me. Thank you for sharing your collection, I LOVE this stuff. One day I hope I can collect some older camera and editing gear to add to my workflow. Not for like A or B type shots, but the real thing is WAY better than any tomfoolery in the NLE.
The reason that old gear was built as a "rackmount" unit was because the workstations in the control rooms were (and to some extent still are) 19" racks. Instead of having a unit that sits on a horizontal desk, the desk itself is sloped and has 19" rack rails built into it, then the gear is mounted directly into the desk. If there's no gear in a section of desk (e.g. the director's workstation), they instead mount a blank panel into the desk.
I completely understand why video mixing is so interesting to you, these devices have always fascinated me! I remember when I was young and I was invited to take a look behind the scenes of a big television broadcast station. It was so interesting!
Man, every video you make is so interesting. I can only imagine how much content you have just, sitting around. I'd love to see more videos where you just talk about random appliances and their history. Keep up the good work!
Gravis, that Ampex ADO cost half a million dollars back when it was released, drew 60 amps of 240-volt AC, and contained two 200-amp five-volt supplies and two 50-amp plus-and-minus-12-volt supplies! The mainframe was huge and very difficult to repair... Ask me how I know! :) We had four of them back in the late 80s!
holy crap - that's about what I figured on the price, but I can't imagine having four of them. I figured these would be a single object that was treasured and revered by everyone and you had to schedule time on it, haha.
@@CathodeRayDude We had several Abekas DVEs, video switchers, and DDRs as well. They were mostly analogue component video and parallel digital CCIR-601, before serial digital video! The video was run from frame to frame with high-cost, low-capacitance (2 to 5 picofarads at 25 metres; the cable cost several dollars per foot!) DB-25 parallel cables, at a max length of around 25 to 50 metres! The Abekas switchers didn't have MEs, but had layers; you would build the images up layer by layer and then record them off to a 60-second hard-disk digital disc recorder for playback and record. These were all multiple-million-dollar solutions! I had to sign an NDA to never speak of how they were connected together, as that was an in-house production trade secret! Serious stuff back in the 80s and 90s!
To add to what @dan b said: each ADO unit he described was a *single* DVE channel! These were a separate piece of equipment from the switcher, whereas modern switchers offer DVE on every upstream key. Each ADO channel was about 10-12U (?) of rack space at least, about the size of a bar fridge. You'd get two in a full-height rack. The 4ME GVE Kalypso I used later on was smaller than 1 ADO.

They worked just like a Character Generator would with a mixer. The ADO provides both a video output and a key output. The key output is a black and white video channel that the mixer uses to dictate the shape of the key. This is known as External Key; you can even do it "manually" with two cameras. Mixers would either have specific key inputs, or inputs programmed to be tied into video and key pairs.

The ADO terminal you showed a picture of at 27:58, @Cathode Ray Dude [CRD], is the Z80-based microcomputer controller described in the manual. This did the math part of the effect geometry, keyframing and file storage. The terminal fed the geometry to the ADO via a serial data connection. The data was real time: move the picture and you could see the numbers change. You could edit your effect by text input, and it even had copy and paste functions, so you weren't doing everything by eye.

Because these things were so expensive, the terminals had a resource-sharing system. You could "acquire" and "release" control of a DVE channel depending on how many you needed. That way multiple studios or edit suites in a facility could use the ADOs as needed, using centralised video routing or just a patch bay. A TV station I worked at had 3 ADOs, shared between a studio and an edit suite. They normally sat at a 2:1 assignment, but the studio could borrow the third ADO when doing a bigger show.

I helped one of our engineers replace a PSU one day. I remember us joking "200 amps would kill you, but the 5 volts would do it real slow". They're the size of a small car battery. 
All linear PSU technology. The internals were amazing electronic design, too. The "motherboard" was a backplane built with wire wrapping: every connection is an individual hand-placed wire, all the same colour too.

While the quality of the video the ADO produced was quite dated by the time I used it, the unit as a whole was a joy to use. The terminal had a purpose-built key layout and a full screen to display geometry data on. DVEs built into mixers all have to shoehorn the data display into what's available in the mixer; large-scale units like the Kayenne and MVS have a PC running their menu systems, but the DVE data still has to work within the overall layout. Like @Cathode Ray Dude [CRD] says: these things are a conglomerate of user interfaces and a great study in the different ways to achieve the same outcomes.
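[Editor's note] The External Key mechanism described above boils down to simple per-pixel arithmetic: the monochrome key signal decides, point by point, how much of the fill (the DVE/CG output) versus the background shows through. Here's a minimal sketch of that math; the function name and the normalized 0-1 pixel values are my own illustration, not anything from the gear discussed.

```python
def external_key(background, fill, key):
    """Composite a fill source over a background using a separate key
    signal, the way a DVE or character generator feeds a mixer.
    All three inputs are flat lists of pixel values in [0, 1].
    A key value of 1.0 shows the fill; 0.0 shows the background;
    values in between blend the two (soft key edges)."""
    return [k * f + (1.0 - k) * b
            for b, f, k in zip(background, fill, key)]
```

Doing it "manually" with two cameras, as the comment mentions, just means feeding one camera's picture in as the key channel instead of a generated shape.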
Saw your mention about radio automation software. Even the older stuff can get absurdly complicated there - interacting with GPIO/RS232 to mixers, and even ISA codec cards on the ancient stuff. Looking forward to seeing what your video on that is like, and somewhat hoping I don't recognise the equipment! 🤣 I've got a few videos on the @technicallymindedtv channel if you're thoroughly bored, covering broadcast radio with a UK bent.
The Datavideo SE 500 was considered a high-end piece of kit here in the Philippines even up until 2020, before the ATEM Minis took the world by storm. Got to use one in 2013 in high school, switching with two consumer CCD cameras and a MacBook converted from 1024x768 VGA to 480i composite, and loved it haha. It's fascinating that you can score a full HD video mixer for just $300-500 now, thanks to the demand for consumer-level streaming brought on by the pandemic.
I used to have an MX50 in a university video studio I worked in. You can get a straight cut between A&B busses by hitting the auto take button on the right of the T-bar, when the transition knob is set to minimum (all the way to the left). I do remember using this mixer with a programme / preview workflow - if I recall correctly, the preview output displays whichever bus isn't currently being sent to the programme output. I can't remember whether the auto take swapped the busses, or just simulated the T-bar switching over - I think the latter, as I have a recollection of the T-Bar's physical position being out of sync with the current programme output - hence the little red LEDs on the left of the T-bar to indicate which bus is live. Thanks for the trip down memory lane!
I was the TD for several small live video productions at a university during COVID. I operated four Panasonic remote controlled cameras, as well as doing video mixing with a Blackmagic ATEM Television Studio Pro 4K. A bit overkill for four cameras but I'm not complaining. There were I believe six other inputs, but I never used them as they were video sources from computers and unplugged cables - I swear one went to a security camera backstage.
That might well have been a security camera. I could see one benefit being so you can tell who's gonna run out from behind stage. Or what mischief might be going on. Plus a security camera is way cheaper than a professional camera.
I work at broadcast events for eSports across the globe, especially for CSGO and fighting games. We still use a traditional video mixer, generally a large Blackmagic one just because we have one, but every passing day we are automating our broadcasting more, whether it's hooking into an API to get game information and switching automatically, or synchronising AR with cameras, lights, pyrotechnics and the like. We most generally use vMix nowadays. A LOT of vMix. Like, ~5 instances of it at a time, interconnected at different parts of the venue, to create one cohesive broadcast experience. It feels like the general trend in gaming broadcasting is going towards having one "prod table" that takes care of EVERYTHING and connects directly to vMix to switch things around. Especially in smaller, one-game-one-stage broadcasts, we sometimes skip over the video mixer completely, connecting all cameras directly to a Decklink and using XL Stream Decks with Bitfocus Companion as a glorified mixing board.
God I'd give anything to see a full video on that Ultimatte model you have, that thing looks more like some vintage test equipment and I'd love to see the quirks of pre-PC chroma keying
It is apparently an INCREDIBLY advanced piece of gear. So advanced that it's either broken or I don't know how to use it. But trust me, if/when I get it working, it'll get a video.
@@CathodeRayDude I hung on to my cheat card for the Ultimatte IV for the longest time through several moves - I'll see if I can find it. It is really hard to get it going just by fooling with the front panel till you close in on what you want. I got myself in real trouble once thinking I could wing it first time out. Later, mid-80s, they came out with the Ultimatte 300/News Matte, which was smaller and only had a few knobs, making it much easier to just do the basic weather-map effect while retaining shadows.
@@CathodeRayDude I found it! The quick setup guide for the Ultimatte IV, with the troubleshooting guide on the other side. I even have two copies, so I can just give you the original one on cardstock if you want. I sent a scan to your gmail.
About 10 years ago, I got the wild idea that I wanted to "run an analog TV station" as an art project for an annual event. While I researched all the broadcast (AKA radio) details, I scoured eBay and local auctions for anything super old (eg: affordable), but never found anything. I also needed at least one other person to help with everything, which I also came up empty on. Years passed, and it never ended up happening. FF to today, and watching this video has reinvigorated my desire to make it happen. Time to start my searches again! Thanks for the info and the spark needed to rekindle an old idea!
I'm learning VJing and modern video mixing in software, so this older equipment is a really fascinating insight into the history of effects systems and their UI design choices. I can see some interesting parallels like the large main lever on the right side of most units, which gets an equivalent analog lever on almost every recommended input system today going into a PC
I found a game recently on Steam called "Not for Broadcast". It's about running a video mixer for a news show: making sure you keep the camera on who's talking, bleeping out cuss words, etc. There's a free demo, which is how I played it.
45:43 I have been a professional camera operator for too many years. And I must say that those cameras, with the million buttons and switches, are easier to operate than consumer ones, where you have to dig through menus to get things done. Of course, nowadays there are a lot of menus in modern cameras too. But actually operating a camera is easier with dedicated buttons and switches.
That is true, but it adds to the cost. Also I suspect all those moving parts are more opportunities for failure (e.g. from dust and perspiration getting in).
We use DVEs mainly to set two separate video feeds in a PiP-style layout, mostly controlled by the T-bar for coarse positioning; then we leave it there as we take the shot. Works great for artsy shots during jazz solos, with a wide and a close-up, or even fancy music-related live content mixed with the actual musicians, e.g. a circle of weird colors from a separate feed on the LED wall and a wide shot of the same LED wall but with the band in front of it. But I mean... that was 2 out of 150 productions last season, so DVEs are mostly just for kicks and to please the creative enjoyment of our directors :D
If you're talking about analog composite video: if the two signals are synced, you can actually mix them with simple summing. That's how that old rack-mount mixer you have there works. It's not doing any frame buffering or anything; it just switches video, and will do summing if you do a fade.
But that requires not just Genlock, but also synchronized phases of the color carriers? And, of course, it only works for NTSC and PAL ... which is why even in France, most studios used to work with PAL or component video internally and only converted to SECAM at the very, very end of the pipeline.
Short answer: no, you can't do simple summing. The longer answer is: if you have two perfectly synchronised monochrome video signals (which requires your sources to support genlock), and you sum only the video portion of those signals (not the sync within them) and divide the output by two, that would work. As soon as you want colour, your signals must be RGB with each channel summed separately. For composite colour video such as NTSC, the colour is encoded in the phase of a subcarrier riding on the luminance signal. You could try synchronising to the 3.579545 MHz colour burst, and good luck with that when a quarter-nanosecond difference will produce a major colour shift, but even if you could do that, when you mix you'll see huge colour shifts anyway.
I was the lead producer on my middle school's TV station from 2018-2020 and we used a Panasonic WJ-MX50 every day - we were a 4:3, SD production house, using three Panasonic Mini DV camcorders hooked up via composite to the switcher, and to three small B&W preview monitors, with the program display output going into the JVC version of the CBM 1084 CRT monitor. For some reason, we also used an external audio mixer. We never got to add the character generator, but I would've loved to and want to get my hands on my own. I do understand your issues with A/B but that's just always how we did it and it always worked for us (with only three cameras, you could always keep your eye on all three - the morning news show had camera for boy anchor, camera for girl anchor, camera for both anchors, plus a fourth VTR input).
Back when I was pulling cables for ESPN it always boggled my mind that the tech director was (on some level) tracking a dozen cameras. They had assistants to help with comms and spotting and everything else but still ! A dozen dang video feeds and one person is deciding what millions of people see
A few years back I learned what was involved with audio for sports ball games: same thing, one person is literally listening to multiple microphones. I do audio mixing for bands and I feel like I can't comprehend it. And then you throw surround sound in.
I think the grand-daddy of all video effects units came from a British company called Quantel. They had products with names like "Paintbox" and "Harry" (?) with six-figure prices, that could play around with video, in real time, in quite mind-boggling ways. If you look at old Kenny Everett TV shows, I think a lot of the mad stuff he did there was done with a Quantel machine. Then there was an earlier product called "Scanimate". This did all its amazing video warping and shattering and stuff with entirely analog circuits. For a brief time, they were apparently renting this thing out to customers (mainly producers of TV adverts) for $10,000 a day. I think there is one unit left still functioning, in the garage of a former engineer of the company, who has a YouTube channel where he shows it in action.
WOW! major flashback dreams about that Panasonic switcher in the front row! That was one of the first switchers I used in a mobile production van! They were tanks, and almost never let you down! They didn't do much, but what they did, was awesome and very reliable! Loved the unlockable T-bar Mix effects / wipe lever! Funtimes!
I did public access back in the late 80s through early 90s - our video switcher board had all sorts of features, but was still only capable of doing 480i. I did the whole producer training, and helped produce many episodes of a local zoo show, as well as a lot of sports events at local high schools. We even had a couple of Video Toasters... but almost all of our studio cams still used pickup tubes (CCD was out but was really expensive then), so good luck doing anything fancy. :D
Since people seem to find this interesting, I'll call out some of the gear we had in each of the studios (two large ones and two small ones). We also had several mobile truck studios, which had similar but more condensed sets of more or less the same. Almost all of this stuff was rack mounted in a series of panels behind the switch board area (but within reach) and along the side wall.
- The main video switcher - much larger than the ones shown in this video - ours had at least a dozen wipes/fades/etc. and several modes to insert an inset (used way back in the day to show a scoreboard of a sports event in the corner of a live shot). Pretty nifty for the era, although far from top-notch for the era (we were a public access station with decent stuff, not a major network affiliate). To run this, you needed to demonstrate mastery - this person has the greatest ability to make a goof, especially if you're doing live (for sports events, we often did).
- VTR units - i.e. VCRs, but of the U-matic 3/4" tape format - you'd have one unit set as master record, and several others where you could queue up various B-roll clips or the on-site shot you pre-edited; have your host introduce the video, cut to that tape, back to host once it's done, repeat, etc.
- Sound loop players - little short loops of audio in nice little tape modules - insert a pre-recorded audio advertisement, etc. Push the button and it immediately began playback; once done, it would stop, queued at the right spot for the next time - pretty neat for back then.
- SFX/audio mixer board - same idea as the video switcher except lots of channel sliders; similar to a band's touring mix board but only a dozen channels or so. Nothing fancy like a recording studio board (those have dozens/hundreds of channels).
- Master signal generator / genlock - a rack-mounted master clock - it sent a signal to all the video gear and cameras to keep them in sync. A requirement for the old NTSC standard to avoid any flicker. 
- Light mixer board - control the various lights in the studios - this was pretty fancy, as we had a variety of plain roof lights, smoky gels, indirect, etc. Sadly not configured to use that protocol where you can send signals in real time, so somebody had to run this when you were doing a host intro and fade out at the end of an episode for the studio stuff.
- Studio cameras - generally three huge tripod-mounted monsters - they did have a limited ability to move, so we would set them up for a show then leave them in place. Wonderful setup: preview display on top, with a red light if master was using your feed (so you'd be careful not to do anything dumb), and a connection for a headset so you could whisper to the control room and hear their instructions - "camera 2 get a close up of guest", "camera 1 get wide shot", etc., while cam 3's feed was being used. They had nice hand grips and were very well balanced, so it was easy to aim; zoom was a thumb control on one hand and focus (haha, you expected auto-focus?) was a twist on the other. If you aimed at the lights for more than a split second you would ruin the tube, which cost at least a few hundred bucks to replace, so just don't do it. :D
- A whole row full of preview TVs - one for each of the VTR units, one for "master out", and several others.
- Shoulder cameras - we had a few smaller ones, more along the size of news-gathering gear - but they took U-matic tapes, so were bigger than a VHS camcorder. Once again, don't point at the lights - although these were a bit more tolerant than the studio cams.
- A giant room full of audio cables and reels of video cables - audio was normally done via the same stuff professional audio people use, nice rigid cables with XLR connectors that actually clicked and stayed connected :P Video was done via these cables, at least 1/2" across and containing 40+ conductors - the end connectors plugged in and you twisted to lock them in place. 
- Going out with a truck to do a sports event was similar but often with less gear - 2 or 3 cameras for volleyball - maybe up to 5 for homecoming football. The "fun" part about those events was needing to do setup and teardown - and making sure to place your precious cable bundles somewhere they won't be constantly walked on by people who just can't seem to be bothered to step OVER the obviously very expensive cable. I wish I was kidding - those video cables were like a magnet for "moms with high heels walking out of bounds because they think they can" and we'd have to shoo them away. :D This was the point in my life when I learned about rigger's/gaffer's tape - like duct tape but much much better (even if more expensive).
@@chouseification It's like what they say about fiber optics: the yellow digging machines will come for it. I did audio mixing for a band at a race (running, fundraiser, nonprofit), and wouldn't you know it, of all the places, the pathway decided to form right where all my cables were running from the mics to the mixer, and I didn't have a snake or longer cables at the time.
@@imark7777777 Yeah, that's how it always works... and if you'd known, you could bring those big rubber guard strips (heavy monsters that they are), but the venue often claims "clear safe path", which means "only family members have access", but they're nosy and step on things; or worse, it's near some VIP area and these folks think their access to the fancy bar somehow allows them to destroy your gear. Yikes. As to fiber optics, I was working at an ISP around Y2K when, during some digging work, a major fiber between POP sites was cut near my metro area - luckily we didn't drop completely, due to several other links, but the traceroute was odd until that link came back up (some poor tech had to try to splice the old fiber, or maybe they had to run it again between the nearest transceivers). :P

One of the oddest shoots ever was for a really awesome cover band called The Dweebs from Wisconsin. They're really talented, and can do a whole lot of songs. We drove way the hell to La Crosse to film them doing an Oktoberfest show - excellent show, great sightlines, we had a direct tap into the audio guy's board. The only problem was that two of the three cameras (to be able to film over the large crowd) were in the balcony... a wooden balcony built by who knows, that was bouncing every time these way-too-drunk (it is Oktoberfest) peeps - who were some special club members for the festival and thought that got them the right to be there - attempted to "dance". Sure, if they stood mostly still and silent, no problem, but they were bouncing around like a bunch of fools, and I wasn't really feeling happy about the structural engineering of the wooden balcony. It took some fetching of organizers to get them to shuffle off to somewhere they could annoy other drunk people, but eventually we got the loft back (we let a few chill folks remain with us though, no need to be mean). 
The camera down on the floor was doing a variety of crowd shots and band closeups (we knew when specific folks would have solos coming up, etc) but that was just a pimp daddy camcorder securely strapped to whoever's turn it was to do that rig. :P
@@chouseification Yeah, I wasn't that fancy with the rubber cable guards at the time, as I was literally 5 to 10 feet away from the band. Although I did do the music tent for our small local county fair (minus the animals, so we call it an expo). We had a fiddle contest where all three judges decided that they were going to move into my tent and act like they owned the place. And then they complained about being behind everybody and not really being able to see the players; shocker, it's almost like they were sitting behind the stage. Oh wait, they were to the left.

I've done an international water tasting for 3 years now, and I love offering up press feeds. Occasionally we'll have other groups come in and film, like TV, documentaries, etc. I've only had one person who understood what that meant without me having to explain, and they took me up on it. I've also edited videos where having a direct feed from the board is so much better than having the room audio from a gymnatorium.

Oh yes, the fun situations we find ourselves in. I got into a concert at a nearby local theater free for helping them tear down. Not only did half, if not all, of the people coming in have earmuffs and hearing protection on, which made the entire thing a horrible experience for me - the whole thing sounded better in the lobby over the fallback system. Oh yeah, where was I going with this: the old theater balcony, needless to say, started shaking with the spectators and even made one person nauseous. It's nice that you got them kicked off, and it's also nice to keep the responsible ones. Rickety structures, ekkk.

I was asked to film today (or yesterday now) at my local Art's Place; they had a kids' art thing in their black box theater, which has four posts, and guess where I was. I just barely managed to get angles, and on top of that they had a band set up at a completely other weird angle that I had to jump back and forth between with one camera. 
I did it free as a favor, so I didn't necessarily feel like, nor have the time, to grab the second camera. Although I did get a compliment from the new director on how unobtrusive I was and how I got a decent angle, and they want to work with me again. The funny thing is, under the previous director I used to volunteer at the digital media center that fell apart, consisting of a 3-camera community TV studio. There's a really small town near where I live, and every so often they lose phone+911 service because the line somehow breaks between here and there. Lucky you still had service; from what I understand, a lot of lines get consolidated down to the point where they might've been redundant on either side of the street but, after consolidation by some penny-pincher, now run through the same cable.
I love this video! I've worked in video production for 10 years and it's neat to see someone so interested in the most mundane part of my job. Some little notes from my personal experience on what you talk about in the video. (Disclaimer: I did not go to school for any of this, I learned as I went. I got a job on a video crew and 10 years later it's now my career, so I might just be some yokel and I've been doing everything wrong this whole time.)

1. It's funny that you mention running a broadcast with a dinky mechanical switcher, because I have actually done that once. I had a baseball game where a foul ball struck my TriCaster 400 and it would not turn back on. We used a 4-way SDI switcher to run the rest of the game and cut back to a camera pointed at the scoreboard every play so the viewers could keep up.

2. When it comes to transitions, we also only use cuts or fades (or whatever branded stinger ESPN gives us (also, we don't call stingers transitions, we call them "video cuts"; not sure if that is normal or not)). My understanding is that the goofy transitions are largely used by churches, who have lots of money for the expensive switchers but are not really run by artistically minded professionals. I have done lots of "live editing" using a mixer and never once used a transition besides a cut or fade.

3. We do the same preview/program switching for sports, but usually leave one team's coach on the preview so I can grab the reaction to a bad play.

4. On the topic of sound mixing: it is by far the best job in the industry, because you get a room all to yourself and nobody bothers you as long as everything works. You also get to hear all the gossip reporters chat about during the commercial break. I miss my sound booth days.

5. The T-bar is good for emotional interviews, when you want to really "feel" the transition. I don't use it often, but many people I know use it exclusively; it might just be down to when you got into the business. I love the cut button. 
6. You mention big sporting events and how complicated they can be. Usually there are one or more co-piloting producers there to watch the feeds and help pick shots. Along with multiple replay operators, it helps to not overwhelm the director with each individual camera feed. Sorry that was so long.
Okay, so yeah, the T-bar tends to be for when you want to do a variable fade on the fly. The auto trans tends to be your best friend as a TD. With regards to DVE transitions... as a local news TD, I typically only use two; all the others tend to be "cheesy" imo. The two I do use pretty regularly are a lens flare effect (almost like two stars flying across the screen with a lens flare in the middle) and a page turn. I use the lens flare for lighter stories, or for certain fullscreen graphics (like "If you or a loved one is in need of help, call this number"). I use the page turn to transition between multiple graphics in the same story. I guess I also commonly use a white flash, but I don't think that technically comes from a DVE, since it's just a color being generated by the switcher. Anywho, like I said in my other comment... these things tend to go through automation in local TV news. So we have some central servers that feed commands to the switcher to get it to behave a certain way. It's really interesting how many machines work together to get a production going. For example, that loud thing you had, we refer to it as the frame. It holds ALL the technology for the switching, as well as several other things (oftentimes it has a multiviewer that allows you to view all your sources that are plugged into the switcher). The actual board with all the faders and buttons is mostly just something that allows you to manually interface with the frame, and isn't even required on many modern switcher setups (due to the automation aspect of things). These switchers also have the ability to send commands to your video server, graphics server, etc. (usually referred to as custom controls or macros). This allows you to pre-save some of the effects or transitions without having to manually recreate them every time.
There's a lot I haven't even touched on, but the technical nitty gritty is what has kept me in the industry because the technology that goes into it is fascinating to learn and become more of an expert in. Thanks for the video! Even from my knowledge and perspective, it was really insightful to see from a more general technological pov. Your videos are truly fun to watch.
Thank you so much for enjoying the video and for the info. I've heard a lot about how modern productions, news in particular, have almost no human beings involved during the show other than the talent, and it sounds really depressing compared to the chaos of yesteryear. I would really love to have been involved 15-20 years ago.
@@CathodeRayDude yeah, they've whittled it down in most newsrooms, taking out the TD (with a Director only), graphics op, etc. Some have even gone as far as getting rid of the camera op, audio op, and teleprompter. Luckily, the newsroom I work in is led by a guy who wants just the right amount of automation because he'd rather have human errors than technology breaking constantly.
Interesting, I had not heard the term frame used before. It's fully understandable, and I suspect it's something out of the telephone industry, since a telephone switch is usually put together in frames, which is also where we get the standard 19-inch rack spacing. It's interesting how the technology overlapped and the terminology drifted into other use cases.
@@brycejprince As a TD working in a (mostly) fully staffed, non-automated studio, I hope they never change. Troubleshooting a sleepy cameraman is easy: "Why did you have camera 1 and 2 switched at the start of the show?" "Sorry, I read the shots on my rundown backwards." "Understandable, our show airs very early in the morning and waking up takes time, just try to pay closer attention tomorrow, ok?" Versus potentially hours of fiddling with stuff and trying to reproduce an issue. We had one of our ME busses go 180 degrees out of phase on the colors yesterday (red became blue and blue became red) and engineering is still trying to troubleshoot the issue over 24 hours later...
Mixing console chassis were custom-made furniture for these rack-mounted devices. The racks were set up at angles for usability as the button farm grew. This was also when you'd build your own component setup: "I want Panasonic switching, Crown amps, Sony effects", that kind of thing. It's akin to guitarists building their own rack effects/amp combos back in the day; even pedal board combinations have now given way to amplifier sims. I was a pro guitarist back when all you needed was great hair, and we used to call them spice racks. But back to the video side of the world: the days of custom builds like that are over. Modern products have everything built in, even the slope-up/stacking in the chassis for the button farm. :)
Gravis, on modern video switchers the automation systems, such as Overdrive, are fed from the newsroom automation system (for example, Avid iNEWS), and all of the customs, video frame buffers and video playback, CGs, teleprompters, audio boards, routers, under-monitor displays, etc. are driven in real time, with everything under automation. The TD then basically just hits enter, next, next, next, next for the entire show. This is my main job at work now: the care and feeding of the automation systems.
I've seen fast wipes used by NHK for news and emergency broadcasts. But that's the only place I have seen them. In Europe, news broadcasts use hard cuts and it might be because we see stingers as the exclusive effect of sports broadcasts.
I used one of these back in the day for live-editing weddings, theater productions, and my girlfriends' figure skating videos, then offering DVDs made before the event was even over, or for live-editing skateboarding that went right to a big screen at a competition. What a throwback; you're gonna get me to buy one of these for no reason now.
I just watched the entire video. This is really stuff I've been interested in. I REALLY wouldn't be surprised if companies used this video as an actual training video, even if you mention yourself that after watching this video you might not be able to get a job as a TD. I feel like this could actually be really helpful for new people in the field, to understand the basics of it. Thanks for making this video! I can tell that it must've taken a lot of work.
The broad strokes are correct, but you can also definitely tell that this is by someone who hasn't actually worked with the equipment in real life use and there are several inaccuracies.
Also, it's not really a field of work you can get into by watching training videos. I don't know how it is in other parts of the world, but at least in my area, a bachelor degree is pretty much required to be considered for a vision mixer/TD job in broadcast.
On the Panasonic switcher, you should be able to set the time for the transition to the minimum, then use the button to cut. That was preferred on the one at our local public access station. We had one for portable setups with two cameras. The reason to use it as a cut button is that it has a built-in TBC (Time Base Corrector) for each bus, so your inputs do not need to be genlocked. A test you can do: set up two inputs that have action. If you switch between them on one bus, there will be a delay of about one or two frames as it syncs the video. That didn't happen when going between the two buses. Hopefully they fixed it on later versions, but I doubt it. A quick edit: the built-in TBC only synced the signal, with none of the other adjustments you could get from a good external TBC.
On an early video switcher I used at a public access TV station, you HAD to use the take button. The switcher only had two frame synchronisers and the cameras we had didn't support genlock. So if you did a hot take to another input on the program bus there would be a huge tear in the image.
TD here too! In my past life I used to TD for live horse racing, and the wipe was totally utilized a lot. We used the horizontal wipe about halfway through races. Cam 1 would show a wide view of the field along the top half of the frame. Cam 2 would be tight on the first few horses on the bottom half of the frame. Stick a horizontal wipe transition halfway between the two, with a nice feather, and what you have is the entire field of horses, tight and wide. I would also use star wipes and the other wipes when I was bored.
OBS can totally do the trick! I just recommend creating an overlays scene and adding that scene on top of the other "CAMERAS SCENES", and then, enabling the studio mode, configuring shortcuts, and running the multiview window. Now you have a virtual video switcher! With the OBS web socket plugin, you can even have tally lights on your smartphone!
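The tally idea above boils down to very little logic: derive each camera's tally color from which scene is on program and which is on preview, and push that state to the phones whenever obs-websocket reports a scene change. Here's a minimal sketch of just that mapping step; the camera names are invented, and the real event wiring to obs-websocket is omitted.

```python
def tally_states(cameras, program_scene, preview_scene):
    """Map each camera scene to a tally color using the classic
    convention: red = on program, green = on preview, off = neither."""
    states = {}
    for cam in cameras:
        if cam == program_scene:
            states[cam] = "red"
        elif cam == preview_scene:
            states[cam] = "green"
        else:
            states[cam] = "off"
    return states

# Example: CAM 2 is live, CAM 3 is cued up on preview.
cams = ["CAM 1", "CAM 2", "CAM 3"]
print(tally_states(cams, program_scene="CAM 2", preview_scene="CAM 3"))
```

In a real setup you'd call this from the handler for OBS's scene-changed events and broadcast the resulting dict to the phones over the websocket.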
@@CathodeRayDude I think that what gives that feeling is "sharing" the same keyboard to the shortcuts. It seems like you can't use the computer for something else, even though you can. Also if you're covering sports and need quick and reliable replays, definitely Vmix is the way!
And if you throw a touchscreen monitor in the mix for your multiview you now have touchable cubes. I also use a numerical keypad for cross fades, cuts and camera selection.
@@felipeamdd I found out the hard way during testing that if you bind to a key, like say A through Z, you can no longer rename things... Sometimes you'll get that letter and sometimes you won't, but you will also be activating that shortcut.
@@imark7777777 I solved all these problems by connecting two keyboards to the computer and using software called "HID Macros"; basically you can assign macros to only one of the keyboards while the second can be used normally. We used to stream our church services with this setup, and for that, I used a full-sized keyboard for the macros. Now we have a NeoID Studio 6 video switcher, so all the heavy cutting is done before it arrives at OBS and we just need to control lyrics overlays and PIP for the Bible reading. So I moved all the macros that we still use to a small numpad-only keyboard.
Director/ TD here, great video! You got everything basically right! You're correct that the millions of wipes are hardly ever used. Manual T-Bar would be used when you want a slower fade/ transition done on the fly without reprogramming your preset transition duration. Also, in some circumstances, I will leave one source half faded over the other for a duration. You're also correct that you could do a rudimentary split screen in a pinch if you wanted to, but I haven't been in a production that has done so. For modern live broadcasts, say live basketball on ESPN, the real magic happens in the graphics computers that feed the switcher all the fancy graphics to your DSKs or MEs. The actual switching of many live shows is pretty formulaic, but the graphics are where it gets complicated.
Someone may have already mentioned this and I just missed it, but regarding all the rackmount stuff: bear in mind it's not just vertical enclosures; there were also desks designed to hold rackmount gear horizontally or at a slight angle.
There’s so many different discussion topics that come from this video. Technique, creative style rules, deeper history and even user interface. There’s so many points in my careers (plural) where having a control surface made for purpose would be an invaluable addition next to a mouse and keyboard. @Doug Johnson Productions made a great video about custom controls for video work last year.
I used to work with the modern versions of this equipment while running tech at a theater. We had both a very small simple 4 input board for our projector and a larger board for our live streaming system (a God send during covid). I am so glad we are down to 1 cable per beautiful HD video signal now. I can only imagine the rats nest of wire and the absolute pain and suffering of having to diagnose any problems with those older systems.
I am so glad I'm down to less than one cable. I work for a guy who wanted to reuse what he had to do a "professional" streamed live event. I'm talking using two Mevo gen 2 cameras and 30+ seconds of delay hacked into OBS. (I believe when he asked me what camera to get, before I started working for him, I said the Mevo gen 2 that was out at the time, with the explicit caveat that they only support one camera at a time, and yet.) I finally talked him into getting the newer ones that supported NDI. Amazing. Then he wanted three cameras, and hey, guess what, he already has a camera, we could just use that, right... More than the minimum 30 feet of HDMI cable, couplers, headaches, and different cameras' sensor differences... I finally talked him into getting a third. Which means I have one ethernet cable going to the front and a PoE switch that powers three cameras. Tracking down a fault in a cable vault had to be maddening, and then there's crosstalk and interference.
55:10 I know the Stream Deck and other painfully expensive switchers exist, but if I were a high profile streamer using lots of cameras, I would love to use traditional mixers. Maybe it's for the blinkenlights and big buttons with positive feedback, but also because it just makes sense and looks cool in the process. Even if the mixer was just an interface device for software on a computer, it just makes sense. You just punch a button in a bus and you're previewing or streaming that camera, no need for a mouse or digging through menus. Sadly, from what I've seen, that's not a thing that exists, at least on a decent scale and price point for consumers.
Agreed, I'd try to find suitable little set of crt monitors to go with it, make the setup part of the show, so to speak. I might not currently have space for such a rig but I love the idea
Those special effects generators were rack-mounted because they weren't intended to be used "live" by the TD. There'd be one or more of those mounted in a rack along with the VT decks and other equipment and it would be set up before it was needed to generate a specific effects shot like the titles, credits, or the key for the weather report. The main switcher, which might have been an analog unit that was already in the studio for years without special functions, would cut to its generated output when it was wanted. The big Sony panel that was multiple switchers in one, that was collapsing the functionality of several of those into the main board in easy reach of the TD. If what you needed was effectively extra mini-mixers to feed the main mixer, the obvious form factor was a rack mount because you might need to cascade them or make them share inputs, but then main board doing the bulk of the work would be a console-style mixer.
Software engineer here, and I super relate to the phenomenon you're describing: things seemingly so complicated you can't comprehend them, and yet once you understand the pieces, it's actually relatively simple. I feel like that's how all of software engineering is. If all you've ever written, code-wise, is programming exercises like a bubble sort, writing a full software program seems impossible. But once you understand how to take everything, literally everything, and compartmentalize it... so much of my day-to-day work is about as complicated as that TD board. Metaphorically speaking.
i did an apprenticeship in media production (in germany that means you work for an actual company and get paid while also going to school every couple weeks) and i was (un)lucky enough to be put in a school where all the live production equipment was from the late 90s (i graduated in 2021). you'd have a BLAST looking at all the ancient shit we had to use. there was a room with six 19" racks all the way to the ceiling that was the backend of the control room. video switchers really do come in all sizes and shapes, nice collection. i learned operating a larger switcher with all the bells and whistles, and LEMME TELL YA, doing that stuff in a live environment was one of the most stressful experiences of my life. it was fun looking back, but jeez man. im pretty sure every secondhand switcher you buy had a decent sweat patina at some point in its lifetime.
Very cool stuff. I'm primarily used to tools like OBS Studio but it's cool to learn more about the tools that came before. It is funny how many awful looking transitions there are in these mixers. Stingers really seem to cover most bases. I run a live speedrun event every year and this year I'm going to be making my own with my own filmed elements - sure hope that works!
OBS actually has studio mode, which is based on the preview/program way of thinking. I've understood that a big reason why people still use physical mixers instead of PCs is that chroma keying is much higher quality on dedicated hardware. I've wondered if OBS could use a full control mixer as a keyboard.
I'm well aware of OBS's weaknesses, and while I'd like to get a hardware mixer someday, a) latency doesn't matter much for my projects and b) I'm not a professional. It's a pretty remarkable tool though, and the fact that an open source and free tool can do so much of what a studio mixer does is pretty amazing.
Great video! I learned a few things and I'm something of a TD myself, at least until the billion dollar corporation I work for changed my title so they can add new responsibilities without adding any pay. It's funny you point out the seat of the pants capabilities of a video mixer. The switcher I use costs half a million dollars and I never even touch it. I'm told that with the fancy system we have, I can do all kinds of things on the fly, but I feel like it's all on rails now. I do my job with a mouse and keyboard using software that automates everything. There was something romantic about having a full crew in Production Control, punching shots as the Director calls them out. Working as a team with the camera and audio operators. Yelling at the prompter to keep up. But those days are over for a lot of us. Now the cameras are all robotic and it's my responsibility to program them and keep them from crashing into each other, the more complicated the show, the more disastrous it can be to make changes once on the air. Nowadays I compose the shots, set the focus, put them on the air, run the audio, punch the graphics, cue the talent, roll the breaks, coordinate with the Hub during live network events (since they shut down our master control), sometimes I even have to time the show. Even the talent scroll their own copy with a foot pedal now. There's no one in the control room anymore but me and a producer. So I don't feel like the huge switcher we have really gives us half a million dollars of flexibility. On the other hand maybe I'm just not using it correctly, often we'll get new equipment (like robotic cameras) and literally only two days of training for it. Don't get me wrong, I love my job but it's changed a lot in the last 10 years. I feel like the last man standing. I have friends at other Stations who still run a full crew, but I'm sure their days are numbered.
the switcher isn't quite the heart of the broadcast studio anymore; that'd be the video router, which is basically just the world's largest video matrix, since so much of what gets broadcast is controlled by automation servers rather than ops. even live tv has a solid chunk of automated sections. usually you have every single imaginable video source and destination connected to the router (the biggest baseband one ive ever seen took up two entire 42U racks and had something like 2048 I/O, and the SMPTE 2110 ones can get even denser) and then the switcher gets only a subset of those, maybe 12 or 24 I/O, as relevant to the program.
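Conceptually, a router really is just a big crossbar: every destination is assigned exactly one source, and a "take" is nothing more than reassigning that entry (a hard cut, no transition). A toy sketch of that model, with invented source and destination names:

```python
class Router:
    """Toy crossbar router: each destination carries exactly one source."""

    def __init__(self, sources, destinations):
        self.sources = set(sources)
        # Park every destination on the first source to start.
        self.routes = {d: sources[0] for d in destinations}

    def take(self, destination, source):
        """Route a source to a destination. This is a hard cut."""
        if source not in self.sources:
            raise ValueError(f"unknown source: {source}")
        self.routes[destination] = source

    def feed(self, destination):
        """What is this destination currently receiving?"""
        return self.routes[destination]

# The router sees everything; the switcher only gets a small subset of it.
rtr = Router(["CAM 1", "CAM 2", "VTR A", "GFX"], ["SWITCHER IN 1", "MON 1"])
rtr.take("SWITCHER IN 1", "VTR A")
print(rtr.feed("SWITCHER IN 1"))
```

A 2048 I/O router is the same idea at scale, just with a very large routing table implemented in hardware instead of a Python dict.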
we have a (small) 512 squared SDI router, and a couple of 64 and 128 squared routers for QC monitoring. I worked for a network broadcaster that had a 2k square video router!
Routers can be huge, but I would argue the switcher is still the heart. Routers are one-trick ponies that can only do cuts (and not always clean cuts), so you actually want to transition between sources through a switcher. Most of the cool things you want to do (USK, DSK, fades, clean cuts) are coordinated by the switcher.
Absolutely love the videos about the different cameras and what makes them unique. Not necessarily interested in that stuff as a collector or whatever... But I've enjoyed them thoroughly and rewatched your videos several times. The camera and gear videos are probably my favorite
30:20 sidenote: ever seen a PowerPoint presentation made by someone who just discovered how to use transitions, where every slide has a different one?
Professional TD here, mainly on Grass and Ross switchers. "Wipes" are kind of a misnomer these days. We don't use them to transition between sources much; we use them all the time to create effects during preproduction. Not to actually wipe between sources during a shot, but you can create a circle wipe between a slightly darker version of an image and the original to create a spotlight effect. I used one during an ESPN football show to put a circle around our location on a radar to show that there was a big storm cell about to pass over us. When you combine a wipe with an effects dissolve on an E-MEM, you can even have all of this animate. I have a couple of moves in my show just in case I need to pull out an animated vignette really quick or something.
This might be a hot take or completely true, but I have the feeling that OBS could be modified to be an entire production video mixer. All it needs is a way to get a lot of sources in, and a control deck. I think that's incredible considering it's free software. I could see entire productions being done on OBS in the near future, and to an extent I could argue that's already happening.
I will say however that reliability is nowhere near as good as production mixers, but it's certainly a lot better than any other option out there for low cost productions , at least at the moment
The problem with OBS in my experience is that it's too stupid to understand how to switch sources reliably. In a mixer, there's an abstraction layer. All the sources are constantly being ingested into framebuffers, so you can pick whichever one you like. In OBS, if you switch inputs on a source, it takes 2-4 seconds to initialize a new video device; and if you try to use the same source in two places, you can wedge the whole app. It's just not designed for it, and while you could _sort of, kind of_ hammer it into submission, it really just isn't built for this.
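That framebuffer abstraction is worth making concrete. In a hardware mixer, every input is being ingested continuously whether or not it's on air, so a cut is just choosing which already-full buffer to read out. Nothing below is real OBS or mixer code; it's a toy model of the abstraction, with invented input names:

```python
class Mixer:
    """Toy model of a hardware mixer's input abstraction: all inputs
    are ingested into framebuffers at all times, so switching is just
    selecting a buffer, never initializing a device."""

    def __init__(self, inputs):
        # Every input is always being captured, on air or not.
        self.framebuffers = {name: f"<frames from {name}>" for name in inputs}
        self.program = inputs[0]

    def cut(self, name):
        self.program = name  # instant: the frames are already there

    def output(self):
        return self.framebuffers[self.program]

mix = Mixer(["cam1", "cam2", "cam3"])
mix.cut("cam2")  # no 2-4 second device re-init, no wedged app
print(mix.output())
```

OBS's model is roughly the opposite: a source owns its capture device, so swapping the device on a live source means tearing one pipeline down and bringing another up, which is where the delay comes from.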
@@CathodeRayDude There's nothing saying it couldn't be redesigned to do things like that (open source software invites that), and there are also dedicated PCIe capture cards, but it seems like the way it is now, it's not ready to be used for something that important. Although I reckon the "computer BS" could be fixed if you modified the OS to be dedicated to streaming and nothing else, but at that point it may be better to use something else, I'm not entirely sure. I know Linux was used a lot for computer-aided DJing, and the first systems used a custom Linux distro designed to be dual-booted with Windows XP. I'd assume at some point someone has locked down their Windows stream box or used Ubuntu Studio to get more reliability, but OBS is OBS.
@@JessicaFEREM It could probably be possible, but it would mean a redesign from the ground up. Big OBS updates are enough work already; you would need to pay a dozen programmers to realistically make such a big change in a reasonable amount of time. And you're probably right about the OS too. Windows has a lot of overhead, but Linux also has a lot of overhead in comparison to dedicated FPGAs. That's why there's a market for hardware mixers. You can strip down Linux to its bare parts, but in that case you could build something better anyway. But I'm not a software engineer (yet) and I have no experience in OS development. I was a TD in high school, and because the productions were so small, there were a bunch of things that weren't industry standard.
Thank you for taking the time to make this video. Very fun to watch and very informative as I look into streaming and potentially working with similar gear.
Actually, most mixers in the consumer range could not do dissolves, as that requires quite complex circuitry. They could fade or dip to colors, or maybe even had an alpha matte superimpose whose color could be changed. However, mixing two video sources required having a sync signal from the VTR or camera, or digitizing the input feed like the Panasonic WJ-MX series did. Synchronizing both frequencies was what made these things so extremely expensive at the time. I remember the cheapest consumer video mixer that could dissolve between two cameras cost over 2500€. I really like watching your videos about old TV equipment!
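The math of a dissolve itself is trivial; the expensive part is keeping both sources frame-locked so the math can be applied pixel by pixel. Here's what the mix stage computes, sketched per pixel (single 8-bit channel, values assumed for illustration):

```python
def dissolve(pixel_a, pixel_b, t):
    """Linear crossfade between two pixel values.
    t runs 0.0 -> 1.0 as the T-bar travels from source A to source B."""
    return round((1.0 - t) * pixel_a + t * pixel_b)

# At the end stops you get pure A or pure B; halfway, an even mix.
print(dissolve(255, 0, 0.0))  # fully on source A
print(dissolve(255, 0, 1.0))  # fully on source B
```

In analog hardware this weighted sum was done with voltage-controlled amplifiers on the two video signals, which only works if both signals reach the mix point scanning the same line at the same instant; hence genlock, or the digitizing frame buffers in units like the WJ-MX series.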
This is the most comprehensive video mixer video on YouTube. I just got a Sony MCS-8M and you explained everything I needed to know about this subject. This mixer actually has an 8-track audio mixer built in.
You make the most wholesome and fun videos thank you! This takes me back to local tv shows about tech way back in the 90s. But also with the 'Art Attack' vibe. You're great CRD.
Your explanation about being interested in various professional tech is spot on. I worked with a lot of 80s and 90s-era broadcast video equipment in high school...including I think at least three of the switchers you show.
It’s really interesting seeing the similarities between stuff like this and the Crestron AV equipment I’m installing at work. It’s wild. Also, I explained to one of my friends that my hobbies include learning about other people’s careers.
I'm from Grass Valley, where Grass Valley Group mixers were designed and made. I currently work in a business that has a museum of their equipment, and I instantly recognized some examples. I'm extremely interested in this type of equipment and it's awesome to see someone cover it.
Enter the lowly timebase synchroniser and corrector, Genlock, and Sc/H colour frame lock discussion!! Not to mention non-composite and composite sync, blanking, subcarrier, video matching and DA equalisation. Long live the green tweakers!
I like watching people who are passionate about things explain stuff- even if I never really wanted to know its just cool how much knowledge they have about one specific thing
The wipes are still there and mostly used (German TD here; we're called BiMi, short for Bildmischer, meaning picture mixer) to transition through a stinger that does not completely occlude the frame. So if we have a stinger that occludes at most 30% of the image, the desk transitions (automated) through it.
Your little rant on Vmix reminded me about one my favourite lines at work which also is similar to something else you slipped in earlier but uh “all my homies hate WIDEORBIT!”
I am a TD as well. I've been doing it for almost 15 years on "big brother" style shows, which makes me one of the maybe 10 experienced guys at this now in France, if not the most experienced. I also started a few years ago to do live streaming (mostly political, because I am ;) and I acquired my gear on this occasion (BMD is my brand of choice). I enjoyed your video a lot. It's a good one to explain those kinda frightening control surfaces to people unfamiliar with them. It's actually quite simple. To me the trick is more in the doing: which angle, and why, at every point is what actually makes my job exciting. This, and playing with live characters who do not cooperate, as they're living their lives and it's my job to adapt: that is what my core job is and I love it. I think I don't totally stink at it either, cause I'm still here doing it! To answer your question "why those transitions?": it's better to have them and not use them than to not have them and need them. Parts of those transitions, I'm sure, use the same systems as the commonly used effects, so why not put all the buttons there just in case.
As a kid, I was given full control of one of those Panasonic units for weekly church programs and irregular public access TV work. One of the larger churches in my area had a direct feed to the CATV headend (which actually served two competing cable networks) and I’d use their gear to do live shows. We quite regularly would film multiple cameras offsite and then playback for a live mix. It was grand fun.
That Panasonic WJ4600 is the one I learned on at a cable access station back in the late '80s. Brings back memories of a lot of crazy fun low budget productions!
Great video man, I can tell you and your friends are having a good time producing these, this was funnier than I was expecting and loved it. Also really appreciate the references to older videos about "how things work", like the Nimoy one. Keep hoping you're teasing the HVX for a video coming up on it. It is truly an amazing camera. Thanks again
I cut my teeth using a Panasonic WV-5600, a double-bus T-bar version of your 4600. It had a CUT button, and was designed to be inlaid into a tabletop despite being a standard rackmount device. It had some silly analog effects and chroma keying, but was a pretty straightforward SEG for running a three-camera production using WV-5100 studio cameras. It was also the HEART of the TV studio, providing the master genlock sync to every device there, from the cameras to the TBCs for the VTR output. I have also used the WJ-MX50. It was mostly marketed by Panasonic for use in A/B-roll VTR editing systems. The setup we had was 2x AG-DS545 SVHS feeders, one AG-DS555 recorder, an AG-A750 editing controller, and a WJ-MX50, connected via RS-422. Through the magic of buying more feeders, plus timecode, you could record all your camera feeds to tape and later edit it like a live production.
TD here. I use the manual T-bar mostly during artistic programs where the feel of the timing or pacing of the program and transitions is highly subjective. For example, I'll use it during concerts to "feel" out the speed of dissolves/fades live on the fly, in sync with the musicians. E.g. the camera zooms in slowly as the violinist draws out that last slow sensual note and we fade in time to the wide shot, where everything "lands" in harmony and it just "feels" "right". My favorite moments of TV magic are when all the parts, including time (which is what the T-bar is for), come together to make something special.
Spot on! Consistent, evenly timed transitions aren't always desired, and the T-bar is the perfect intuitive interface for the task.
Yep, there's a lot of stuff that's used for things like music concerts which has different needs than regular broadcast TV
I had to do this in college and being live is a lot of fucking pressure.
Good TDs are essential for big broadcasts, and this adaptability is a big part of it -- understanding the content and intentions not just "pushing buttons in order"
yeahh, from experience, the DaVinci he's using has the worst, crunchiest, most friction-y t-bar i've ever felt, worse than those cheap ones that use audio faders
Re: "who uses the cheezy transitions?"
I used to do technical direction for live events, rock concerts, churches, etc., and the answer is churches. Churches can't get enough cheezy transitions!
And I would imagine the vast majority of switchers ever purchased were for churches. They are also used a lot for "image magnification" aka "imag" at live events. Probably the funnest work I ever did in live video was calling shots on those things, it's like playing a video game and you get to wear a big headset and play with the jet fighter controller t-bar!
HAHAHA
When the community TV station I work at started in 1993, our first switcher was home-made. It was just a button that mechanically switched between the two consumer VHS camcorders that served as our first studio cameras. It was dubbed the Bang Box, and when you look at our earliest shows, the picture would roll every time you switched to the other camera because of the lack of sync.
aaaaaaaaaaaaaaaaaaaa
would be interesting to get example footage of that happening as a contrast for whenever he makes a video explaining genlock
a bang box! that's what we used to call a raw non-vertical-interval video switcher too! Sometimes all you need is state of the budget, not state of the art, to get the job done! :)
@@CathodeRayDude Other things I'll add:
I've never heard the term hot-punching before, and I find it funny that it's considered taboo sometimes. We use both methods depending on what we're doing. If we're taping a talking head show with three cameras, we'll hot-punch because you can just keep three fingers on the Program row and it's easy. If we're doing something like Bingo that uses the Key inputs on the switcher to overlay cameras on top of the bingo graphics, you can't hot-punch because it will change the bingo graphics to a camera that has other cameras keyed on top of it, and it's chaos. Ask how I know 🙃
As far as transitions go, the only thing we've ever used besides a dissolve or a wipe is during our live Christmas telethon, we'll use a media wipe with pretty snowflakes to transition between the studio and the pre-recorded clips of kids singing.
The only thing I've used the T-bar for was to do really slow dissolves when we had a musician in the studio giving a performance. That's how you also do that thing where you partially dissolve between a wide shot of the musician and a close-up side view of their face.
On a good switcher, the T-bar is dampened with what I can only assume is really thick grease, so it's very easy to get smooth transitions using it.
I've never heard the term hot-punching before either...
As a broadcast TD, your first example is pretty spot on. I'm always worried people will figure out how simple it is to do at its core 😅 I explain what I do to people as "I use a really expensive keyboard" or "I push buttons"
@@strapsgamingvids TBH, unless you’re in a top 10/25 market you’d probably make more working at Starbucks 😉 I was market 100 or so and baristas were earning more than I was directing/TD-ing at a broadcast news station.
@@beardsplaining see even in shit like this nerds end up getting fucked over :(
I'm sorry to hear that dude I hope things improve and at least its steady and stable with some perks.
@@xLilJamie I work in public broadcast now. Infinitely better than commercial stations in so many ways, especially pay and benefits 😉 Broadcasting is the career I picked up as I neared 30 and got burned out on physical labor. Been in the industry almost 10 years now and make a decent living as a Director/Videographer/Editor
I think that's the core of a lot of trades and skilled service. A network engineer is just plugging cables into holes of the right shape and choosing the right color. Auto mechanics just take stuff off and put other stuff on. Plumbers just install and cut pipe. Television directors just cut between cameras. Technically it's all true, but you can rest assured that sticking any layperson in there would result in them being utterly confused about 30 seconds in, as soon as a second task in the job needs to be done.
Glad to see the mixer I saved from e-waste and sent to you will live on forever in this video! I did a bunch of shows on this unit and you found features on it that I didn't know existed (the dip wipe!) but you're right - we never used the fancy DVEs, it was mostly cuts / dissolves, downstream keys and chromakey. I'll point out that the Grass Valley switchers have the program bus at the bottom, and Echolab had the preview at the bottom - for someone that would do weekly shows at 2-3 different studios I had to always remember to switch - or else you might cut live at the wrong moment! (and that did happen!). Great video!
Can we take a moment to appreciate the sheer number of cameras in the same room all of which he's managed to hide from each other for this production? I think I've counted 2 HD angles and either 2 or 3 SD angles.
oh man the room was VERY crowded. I didn't take a picture once it was all set up, but you're absolutely on point that it was hard to squeeze it all in. I had two Blackmagic pocket cinema 6Ks running, one on a big tripod with a teleprompter, one on a little manfrotto video tripod, but both trailing cables across the room; then I had another big tripod with a Sony shoulder-mount ENG camera, and a second Panasonic ENG camera sitting on a pile of boxes on top of a rolling cart on the other side. A huge cable snake down the center of the floor to get video in and out of the mixer rack module in the other room, plus the boom mic stand and cabling, and three LED floods with softboxes. Me and my gf (camera op) had to climb over cables and squeeze around the back of the presentation desk every single time we had to move anything. I am INCREDIBLY lucky that my viewers make it possible to afford a studio space to do this in; it would have been literally impossible to shoot this video at home, it just requires too much gear!
@@CathodeRayDude And it makes me sad to see that the vast majority of viewers (not only here) won't ever be aware of the insane efforts required to produce something that most of them would regard as "normal low effort crap". On the other hand, you tend to attract the kind of audience that IS aware of this, which is part of why we're coming back for every second of your wonderfully nerdy and in-depth stuff, even if it's an off-the-cuff side note. You care, and it damn well shows. And we love it!
@@CathodeRayDude you truly are the TV production version of a one-man-band!
Hopefully he doesn't reuse. I have been using MEVOs which means I can cut to one camera and adjust another and come back and it looks like I have 4-5 cameras.
I don't understand. Why did it require multiple cameras and all that to film himself behind a desk? Honest question because I don't understand what was going on.
I really enjoyed this video. Having worked in broadcast TV news for 10 years, and now running my own production company that specializes in on-location, live, multi-camera production, I use video switchers on a regular basis. So, seeing the collection of video mixers you had gathered here was a lot of fun for me.
There are a few things I think I might be able to clear up. First, on the topic of A/B style switchers (SFX generators) versus Program/Preview bus switchers- A/B switchers, like your Panasonic switcher in this video, were mainly used in video editing applications, where program/preview style switchers were/are for use in live multi-camera production.
Before computer-based video editing was really a thing, video was edited tape-to-tape. The simplest form of this was having a playback deck and a record deck. You would have an edit controller, or controls on the tape decks themselves, to mark in and out points and record just those selected sections to your program tape. However, if you wanted to edit something that was more complicated, and had more than cuts-only editing (an entire TV show, for example), you would have multiple playback decks and your record or program deck. A special effects generator, also known as an A/B style video switcher, would connect all the source decks to your program deck. You would use this along with an edit controller that selected your various in and out points and controlled tape transport, to actually edit the program, with your video mixer performing the various dissolves, or keying in things like titles.
This is why the Panasonic video mixer has built in audio capabilities, for four audio tracks, the standard number of tracks that were supported on professional Betacam tape. This is also why you have the program select button, you could select to send the program output to a third playback deck, in case you needed to dub scenes from one playback tape onto another, to allow you to perform an edit where you dissolve between them.
Typically, the edit controller would also have a preview button, allowing you to preview an edit and rehearse a dissolve or effect before you hit the edit key and committed it to tape. So a preview bus output on the mixer wasn't needed. The reason why you had an A/B bus, instead of a program/preview bus, is that this kind of editing came from film editing, where you would have an A-roll and a B-roll. These kinds of video mixers were also known as special effects generators because that was their intended purpose: to insert special effects in editing, not to be used for live television production.
In an online edit suite, where you would edit a full and complete TV program, you would have multiple playback decks, a master record or program deck, an edit controller, a character generator, a special effects generator (or A/B switcher), and a basic audio mixer if the switcher didn't already contain one.
The second thing I think I could shed some light on is why a TD would use a T-bar for a transition instead of the auto button.
There are two answers to this. First, sometimes you just want to. There isn't a reason to use the T-bar instead of pushing the auto button, but when you've been doing a 4 hour long broadcast morning show, using the T-bar every now and then helps mix up the job a bit so you don't fall asleep. However, the bigger reason is something I do frequently in my work, which is during live coverage of musical performances. Sometimes during a slower song it is appropriate to do an extra slow dissolve, while other times a dissolve is appropriate, but you want it to be a lot faster.
Next, I'd like to address how and when all those DVE transitions and wipes are actually used and what they are there for. It is true, on a basic 1 M/E switcher they don't have much use. However, when you are switching a program with multiple M/Es you can use those DVEs to build various scenes; combined with keyed-in graphics on the M/E you could build things like boxes, where you have multiple reporters on location and your anchors in the studio, for example. In these cases, the T-bar on these M/Es might be left in a partial position. But you can get these elements set up on the M/Es before the show, or save them as a macro, depending on the switcher you are using, and then during the program bring up that scene you built in the M/E as a source. The DVEs aren't really for doing transitions, but for building these kinds of scenes or elements.
Thank you so much for all of the info, I will integrate this in future presentations!
I was going to say that; it makes a lot of sense to use a T-bar in a musical performance due to the variable timing in it
"film editing, where you would have an A-roll and a B-roll"
Wait, is this where the term "B-roll" referring to some unrelated shot you'd cut to when you want to hide an edit came from?
@@scorinth Yes
@@scorinth Basically, yes. Before video tape, TV news stories were shot and edited on actual film. Back in the day it would be normal to have a news anchor give a quick brief of a story, then promise, "Film at 11:00."
For quickly editing these news stories, you would have your A roll and B roll. Your A roll would have your package's sound track. You would edit your A roll first for sound. Then, you would splice in your B roll material.
When the move was made to videotape, these terms remained. Even though the tech was now different, you would still edit a news package for audio first, then go in and insert your "B-roll" material, even though your B roll was now a tape instead of an actual film roll.
For someone who's never used one in anger, man you did a good job, you hit just about every point I was whispering at the screen
I'm a Technical Director for a news station. you got the basics! DVE can be super useful. You can place multiple cameras in PiP (picture in picture), then make another scene with those PiP windows in different places. Store them as memories and when recalling them, the mixer will interpolate/animate the movement of the PiP windows. I use that for taking either the outdoor studio camera or weather map, gives the PiP windows a neat zoom in transition. Our lower 3rd and full screen gfx inputs are still called CG, but they're fed by a PC playing back transparent pro res files.
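The memory-recall animation described above boils down to interpolating the stored PiP window geometry between two snapshots. A rough sketch of that idea, assuming simple linear easing and made-up parameter names (not any real switcher's memory format):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at fraction t in [0, 1]."""
    return a + (b - a) * t

def interpolate_memory(mem_start, mem_end, t):
    """Blend every stored PiP parameter (position and size) at time t."""
    return {key: lerp(mem_start[key], mem_end[key], t) for key in mem_start}

# Memory 1: small PiP parked in a corner of a 1920x1080 frame.
# Memory 2: the same window grown to fill the frame.
mem1 = {"x": 1400, "y": 100, "w": 480,  "h": 270}
mem2 = {"x": 0,    "y": 0,   "w": 1920, "h": 1080}

# Recalling memory 2 from memory 1 sweeps t from 0 to 1 over the
# transition duration; here's the geometry at the halfway point.
halfway = interpolate_memory(mem1, mem2, 0.5)
print(halfway)  # {'x': 700.0, 'y': 50.0, 'w': 1200.0, 'h': 675.0}
```

Running this per frame (with t advancing each field) is what produces the "neat zoom in" the comment describes; real switchers typically also offer non-linear ease curves.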
What in gods name was that audacity skin? I like it!
Turns out, that's just built in! Look in the options under Interface, select High Contrast mode.
PCs are great for graphics -- easier to design and easy to rewrite stuff on the fly -- plus it doesn't directly sit in the pipeline so worst case you lose the graphics but keep the video switching
@@QualityDoggo haha our entire main feed runs through a windows PC for some reason... So much for compartmentalizing lol
One other major massive difference between hardware and software based mixers is latency; how long it takes for a signal going into the mixer to come out the other end.
A software based mixer needs to capture each frame into memory, transfer that frame over into the software, let the software process that image, and then render it back out to a frame buffer where hardware can turn it back into a video signal or an encoder can compress it into a stream. Couple that with most software mixers working with compressed video sources that also take time to compress, move, and uncompress for processing, and the time adds up more. The latency for a camera through a software mixer can be from a considerable fraction of a second to many seconds long.
Thankfully most software mixers handle the audio too, so they also have some chance of keeping it in sync with the video.
The delay in most professional hardware mixers is less than a single line of video end to end when working with synchronized source signals, and typically one frame when working with DVEs. If you need to buffer your incoming signals to synchronize them, that adds up to one frame of delay (half a frame on average).
Having a minimal delay of course is important to keeping things lined up with the audio that's being processed elsewhere, where a short consistent delay can be compensated for.
But while a TV or streaming audience won't notice or care about latency, anyone watching on a screen in the same space as the action; like a large video screen at a concert, sports arena, church, or conference hall; is going to notice that delay really quickly.
When feeding cameras to screens, we want the light that went into the camera's lens being shoved back out to the projector as quick as possible. We'll run sync/genlock cables out to our cameras and turn off all the extra frame buffers along the way to try and keep that delay at under a frame (one 30th of a second [~33ms]) all the way through. Which is how long it takes for the sound from the speakers to move about 37 feet, which is a reasonable distance to many of your closest audience members for things to still line up in our brains.
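The figures above are easy to sanity-check with a quick calculation. Assuming a speed of sound of roughly 343 m/s in air:

```python
SPEED_OF_SOUND_M_S = 343.0   # dry air at ~20 C (assumed)
FEET_PER_METER = 3.281

def frame_latency_ms(fps):
    """Duration of one video frame in milliseconds."""
    return 1000.0 / fps

def sound_travel_feet(delay_ms):
    """How far sound travels during a given delay."""
    return (delay_ms / 1000.0) * SPEED_OF_SOUND_M_S * FEET_PER_METER

one_frame = frame_latency_ms(30)
print(round(one_frame, 1))                  # 33.3 ms per frame at 30 fps
print(round(sound_travel_feet(one_frame)))  # ~38 feet (matches the ~37 above)

# A software mixer buffering, say, 5 frames adds ~167 ms; the image
# would only "line up" for audience members much further from the PA.
print(round(sound_travel_feet(5 * one_frame)))  # ~188 feet
```

So keeping the camera-to-screen path under one frame really does put the video "in sync" with the PA for everyone within a few dozen feet of the speakers.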
Considering that streaming services have quite a big delay by default, on the order of 20-30 seconds at least, and streamers usually force an artificial delay of minutes to avoid game ghosting or to manage chat, it's understandable that software mixers are good enough for that, while that latency would be bad for live TV.
@@fordesponja that sounds quite bad, and I'm not sure what service you're referring to; Twitch often has sub-10s latency (measured in a not very accurate way, by when chat messages show up on stream - I've seen the latency debug output get as low as 1.5 seconds, but I'm not sure what that's measuring)
and there were some attempts to get it even quicker, with the likes of Mixer
This is a side effect of high level abstracted and modular thinking of modern software dev. Instead of having specific wiring for a specific effect, you put together this input card, general purpose processor, and output card, and now you can do *anything* to any part of the frame, which is really awesome - except it adds some frames of latency.
Open your phone camera app and wave your hand in front of the phone - you can see the screen is delayed. Your phone is capturing a whole camera frame, then the whole frame gets passed through some specialized processor (to save power) that does brightness and colour balance stuff into the main system RAM, then the camera app notices the new frame is ready in RAM and queues it up for the next time the screen refreshes. When the screen refreshes the graphics system puts together all these pending sub-frames (status bar, app, soft button overlay) and now that's ready for the *next* screen refresh.
It's basically put together by the creator of each part thinking about what to do with entire frames and not individual pixels as they come in, so each part of the system adds 1 frame of latency. To be clear, it's totally fine for a lot of applications.
---
There's no theoretical reason software couldn't process pixel-by-pixel, but it would probably still add a few pixels of latency, so genlocking the output with anything else would be tricky - but if it's the final output, maybe you don't care about that.
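The "one frame of latency per stage" point above can be made concrete by tallying a hypothetical whole-frame pipeline (stage names and counts are illustrative, not measurements of any particular phone):

```python
FRAME_MS = 1000.0 / 60.0   # one frame period at a 60 Hz refresh

# Each stage works on complete frames, so each hands off its result
# one frame period after receiving it.
pipeline = [
    ("sensor capture",      1),   # read out a full frame from the sensor
    ("ISP / color stage",   1),   # specialized image processor pass
    ("app frame queue",     1),   # software picks the frame up from RAM
    ("display compositing", 1),   # compositor assembles the sub-frames
    ("panel scan-out",      1),   # next screen refresh shows the result
]

total_frames = sum(n for _, n in pipeline)
print(total_frames)                       # 5 frames of accumulated delay
print(round(total_frames * FRAME_MS, 1))  # ~83.3 ms glass-to-glass
```

A pixel-streaming design would collapse most of those whole-frame handoffs to a few lines' worth of buffering, which is exactly the trade-off hardware mixers make.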
@@fordesponja even for TV, latency is not a problem. but for a live show, where the audience sees the stage and the projection at the same time, it matters a lot. I try to not even use any digital image correction on projectors to minimize latency, and the mixer I mostly use has 8 lines of delay (0.07ms @4Kp50)
@@miawgogo competitive Twitch streamers add that artificial delay. I don’t watch Twitch anymore but I’ve seen anywhere from 5-45 seconds in like 2010-2016
41:50
One application for using two _program_ versions is during a concert.
You can have one output going to tape, or being broadcast, while another output sends a signal to the on-stage screens.
This way you can have a wide shot aired, with a close-up of the singer on the screens that are located on the stage.
Sending the same image to both would result in the Droste effect: the on-stage screens would display the wide shot, which itself contains those screens. So, the shot repeats itself in the screen... over and over again.
I had no idea that had a name! I always just thought of it as the pixel tunnel lol
First time I've heard that name too, thanks!
We always call it the "infinity tunnel" or "infinity loop"; though sometimes I also call it "video feedback" in technical discussions.
@@hyvahyva The name "Droste Effect" comes from a brand of cacao powder. On the red packaging there is a woman holding a can of cacao powder. And on that can there is a woman holding a can...
This is why it is called the "Droste Effect".
But, calling it video-feedback or pixel-tunnel will work. As long as the people in the conversation know what you mean, right?
Many years ago I was operating a camera for a show where the director mistakenly cut to a particularly bad case of the looping screen, their response was just a somehow comedically perfect "whoops". As long as I worked with that company, it was henceforth called the "whoops effect". 🙃
When you said on-stage screens I thought you meant little monitors for the "talent" to watch. Now I realize you mean the giant display for the audience located above the stage.
I took a video production class in middle school in the 90's that did the morning announcements for the school. We had a very cheap A/B switcher and an even crappier CG machine. It is really cool seeing more professional gear. And as immature middle schoolers we used every single type of wipe that the switcher could do, usually using multiple different ones in a single morning.
Yep, by the early 90s only cheaper mixers used A/B
Split screen was used by UK news outlets to get around social distancing, with the set very well designed so that both presenters appeared to be sitting side by side; it would only look really odd if a coffee cup was in the middle. They also used it for a program in the 70s or 80s to make two quiz teams appear one above the other.
The quiz show University Challenge has done split screen since forever. It got so ubiquitous that The Young Ones did a parody using that effect.
The news in the 80s even used those ribbon dissolves which get called ugly in the video, haha
University Challenge! People finding out they don't literally have double decker quiz booths and it's just a vertical cut and getting angry is evergreen
Fun fact: If you were downloading TV shows from the Internet back in the day, they would occasionally come out _before_ the show's airtime. These almost never had the station logo in the bottom corner because that's only inserted, as mentioned in the video, when the program gets played out. Not to get too deep into the weeds, but the station logo is typically inserted as the program is played out to the transmitter. If the logo isn't present on the program, that's called a "clean feed", which is how programs are typically transferred. The fact that the logo isn't seen on the downloaded program means it was captured from a clean feed at some point before the program was played out for broadcast.
Finally, someone else who remembers pirating pre-release shows! I'd get episodes of The Simpsons almost 24 hours before airing, it was great. My friends didn't believe me at first. I think I read that the release group (IIRC they were called FTV, on a tracker known as donkax) had a C-band style satellite dish pointed at the Fox distribution feed, which was unencrypted at the time. This was circa 2002-2004.
@@pap3rw8 hilarious lack of security on Fox’s part, to everyone else’s benefit! I always preferred the clean downloads for visual reasons, and often wondered why only some came that way. Especially since DVDs often weren’t out yet. Now I know!
I've always wondered how those notorious torrent groups like EZTV can get that clean feed. People on the inside?
@@BobbyGeneric145 Either that or, as pap3rw8 pointed out, they got it from unencrypted C/Ku band feeds (it's also worth mentioning that, even if the feeds were encrypted, sometimes they were encrypted using a system that was plagued with security flaws, and decoding the feed was as simple as inputting a hex key on the receiver; see the history of Nagravision systems for an example of that)
@@BobbyGeneric145 some of the clean feeds are grabbed from distribution satellites. I had a buddy in the bay area that would grab them that way, he had a dish on a motorized mount that he'd just point to the network satellite he wanted to snag the feed from, and rip it to a .ts file.
A few comments from someone who worked as a vision mixer (which is "european" for TD) around ten years ago:
- At least in my area, the only type of production I worked in where one would use preview for cuts was in news, because of the rapid, scripted cuts you would have to make. Your right hand would be ready on the cut button for the next cut, while the left hand was ready to preview what came after that, so you always had the next two cuts ready. Anything else we were taught to cut like the BBC vision mixer is demonstrating at 23:48, left hand for cuts and right hand for dissolves. Using the pvw row for cuts just adds an extra step that you shouldn't need; a big part of the job is always staying one step ahead so you know which cameras you can cut to before you need them.
- About the wipe patterns and the other "fun" effects, you're right in that they hardly ever get used. Mostly they're used to hide the transition that happens behind a stinger (if it never goes full screen so you can just cut behind it), and even that usually just needs a regular linear wipe. But you never know when someone will have an idea or request for an effect or transition that needs it, so why remove them? There are other things in the mixer that depend on pattern generators anyway, so adding/keeping a few "extra" generators in order to keep the wipe functionality is probably no extra cost at this point.
- ME steps are commonly used. Mainly, as you say, for "submixing" of effects. In most cases, you could in theory do everything on the program row, but it is much simpler and safer to prepare an effect on an ME so you can just cut to the finished effect, especially when you need to load different snapshots or effects during a show. If you build the effect on an ME, you can just quickly store a snapshot of the entire ME and know that it won't affect anything on your PP (program) step when you recall it.
Another benefit of the M/Es is that you can have more than one panel/control surface controlling different M/Es on the same mixer. For example, on mid-size sports productions, we sometimes use the main control room in the OB truck for the "host" production and the "second production" room for the "national" production. Since they're technically just different M/E steps on the same mixer, the national production has access to all the same sources in addition to the host output.
I worked at a small TV station many years ago. We had a Sony video mixer for production whose control surface was rack-mountable, and rack mounts were embedded into the table so it could sit recessed. So rack-mounting equipment in a table is a thing.
Yeah it's used all the time in sound studios so I'm not surprised at all.
TD here (Blackmagic ATEM Production Studio 4K): a lot of modern video mixers have audio. One of the main uses is automation in AFV/VFA (audio follows video / video follows audio). But as you speculated, the internal media players use it as well. It's also a great way to set limits, etc.
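For readers unfamiliar with AFV, the behavior described above can be sketched as a toy model (hypothetical names and logic, not Blackmagic's actual API): when the program bus cuts to a new source, the mixer opens that source's audio channel and mutes the outgoing one.

```python
class AfvMixer:
    """Toy audio-follows-video switcher: audio levels track the program bus."""

    def __init__(self, sources):
        # One audio level per input: 0.0 = muted, 1.0 = full.
        self.levels = {s: 0.0 for s in sources}
        self.program = None

    def cut_to(self, source):
        """Hot-cut the program bus; the audio follows the video."""
        if self.program is not None:
            self.levels[self.program] = 0.0   # close the outgoing source
        self.levels[source] = 1.0             # open the incoming source
        self.program = source

mixer = AfvMixer(["cam1", "cam2", "media"])
mixer.cut_to("cam1")
mixer.cut_to("cam2")
print(mixer.program)          # cam2
print(mixer.levels["cam1"])   # 0.0 (automatically muted)
print(mixer.levels["cam2"])   # 1.0 (automatically opened)
```

A real mixer would ramp the levels over a few milliseconds instead of hard-switching them (and a dissolve would crossfade the audio alongside the video), but the bookkeeping is essentially this.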
What the wipe and other transitions were used for was mostly wiping graphics onto the screen. Back in the day, your lower third super generator just output a static image, so to get a nice effect of it sliding or wiping onto the screen, the vision mixer was the thing doing the transition between the normal shot and the same shot with the lower third graphics _superimposed_ on it - hence the term "super".
EDIT: Ok, you kinda got there yourself. But yeah, by the time this box was on the market, it was doing the keying of the graphics all by itself. Before that, the graphics would've come out of the character generator (CG)/Chyron/graphics generator into a keyer, which keyed the graphics over the shot of the talking head or whatever, and fed that as a separate input into the vision mixer, which would usually be called the "graphics" or "Chyron" input, depending on how American your facility was. Often the graphics operator would have their own mini vision mixer, so they could overlay the graphics onto inputs separately to the main vision mixer.
I remember the wipe effect being used mostly for commercials. Car dealerships and such liked the fancy wipes. Rarely used in program production.
JD the TD here. Loved your video and great job explaining a very complicated subject in layman's terms. I am a broadcast TD and use the big giant Sony and Grass Valley Kayenne switchers shown in your video on a daily basis, but have also used many of the older examples too. My first switcher, in church, was the Panasonic WJ-50. The Preview output that you mentioned not being able to find was not to preview a bus, but to preview an effect like a chroma key or downstream key (DSK), etc., before taking it. I also remember using a larger Echolab switcher in a remote truck in the late 90s. It had a Pgm/Pvw bus, but the two additional M/Es were A/B buses. It was very confusing to have both on the same panel, and it "bagged" me several times whenever I got called to use it. It's like Echolab was too cheap to make all three M/Es flip-flop like Pgm/Pst, but by having it on one, they could be competitive with other brands that had it on their switchers. I hated that switcher!
The most challenging part of running the big broadcast switchers is remembering where things are located in the menus to do what you're trying to accomplish.
Most productions that I have been a part of have a separate video director calling the cameras, but there have been the occasional shows where I have done both, or "called and punched" my own show. What the general public has to know is that television production is a field where there is very little tolerance for errors. It requires a great amount of focus and mental stamina. It's more than just hot-punching two or three cameras, like in your video. It can be more like punching 5-7 cameras on a talk show or game show, and then being expected to record multiple shows, flawlessly, in the same day. Or being on a LIVE to air show like the news for four hours straight, or cutting a music show or sporting event with more cameras than you have fingers and being expected to hot cut on the beat. By the end of the show, you walk away and your mind feels like a bowl of jelly. It is not an easy job in the least bit, but if you are good at managing a lot of signals and multitasking, it can be very rewarding.
I worked as an interpreter for a news network for about six months and I never saw more than maybe ten keys used on what was a gargantuan mixer. Audio was strictly a separate affair. The T-bar was frequently used though; they would use it to perform transitions between different sources. Sources would be news agencies. Here's an overview of what went down, or whatever I deduced from observation:
The network I worked at didn't have any news broadcaster or any sort of live feed shot within the premises, i.e. the modus operandi of Euronews. Agencies have constant video and audio feeds that the network had subscribed to. There are monitors in the production room that each exclusively display one particular agency. The director seems to get notifications about the contents of each agency feed from an RSS feed or something like that. The director gives the general direction, but I actually did that in their stead from time to time, if you can even believe that. Anyway, the TD would pick a source and preview it with its audio, using either a headset or a little loudspeaker, but that was just to check whether or not there was proper workable audio. The CG operator prepares the text, something like "man marries his sister in Alabama", wraps the text in whatever graphics and adjusts the graphics as necessary, then feeds it to the mixer. The channel logo, the date, or the time seem to be added by the mixer, but they are a rather standard affair since they were constant, and I imagine they just press a key to display a whole bunch of them at the same time. The TD picks the feed, applies the logos and shit, applies the output of the CG generator, which was typically referred to as "KJ" (abbreviation of "karakter jeneratörü", Turkish for character generator), then the audio engineer, who happens to be the best AE I have ever seen in my ten years as a simultaneous interpreter (and I have worked with so many), listens to the audio feed, does necessary adjustments as he sees fit, then adds my voice on top of it. The T-bar was almost always used in transitions of any kind, and I imagine it is to give the TD complete control of how things go down, something I observed to be essential if you want things to go smoothly.
All these were what I could deduce on my own since TDs were not so eager to teach me anything as if I was gunning for their job or something. I am probably mistaken in quite a lot I have said so please correct me if you know better I really wanna know.
Stellar content btw.
I love the Grass Valley video mixer used in Star Wars as the control for the Death Star superlaser. I learned video mixing on such a machine.
Now I'm wondering how many people reach for the T-bar to blow up a planet versus feel the need for speed?
@@AerinRavage The movie Brainstorm had an Ampex video switcher as a security console for controlling alarms and telephones! So hilarious!
@@AerinRavage The T-bar on my switcher occasionally has space laser sound effects.
(Which I make to amuse the crew, or myself. The camera operators know what a cross-fade sounds like on the intercom: "Ready 2 with a fade ... Biuuuuuzt!. 2's up, 3 clear.") 😁
A perfect mixer to mix planets with stardust!
I was going to mention that myself... I saw Star Wars in 1977 but it wasn't until I got my first job as a Broadcast Engineer in 1985 that I found out what that cool looking control board actually was. I looked after three studios all with that same Grass Valley mixer.
I grew up in a TV family. My dad worked for CBS for almost his entire career, my uncle worked in remote trucks doing sports broadcasts, and now both my brother and I work in TV. He is an audio mixer and I am a colorist/online editor. All this is to say that when I was a kid, every time I watched Star Wars (which was a lot) and the scene where they fire the Death Star laser came on, my dad would always point out that the controls they're using were actually some Grass Valley video switcher he was very familiar with from years ago. So, there's another use for that T-bar.
One common use for multiple program outputs is also live screens on set or for events with a live in-person audience. For example the big screens for the live audience at a concert are there for close-ups of the musicians, and maybe some graphics or clips playback as part of the "set" and/or lighting design, but they wouldn't ever get the wide-shot cameras that are used to show the TV (/stream) audience the scale of the event.
Skilled Directors or TDs will even synchronize content across outputs to build virtual "layers" through the real world: the wide shot of the concert stage where the crowd is going wild reacting to the guitar solo they and TV are seeing in a close-up shot on the screens that flank the stage... Then as the solo ends, both the TV program and the venue screens cut back to the lead singer together for the next lyric.
Thankfully modern mixers/switchers also have a lot of automation features to assist the operator/s in managing all those outputs.
Hey Gravis - love all your videos. I help TD for our local news station (CBS affiliate) as well as a community TV station. You hit the nail on the head! When I'm switching I use the auto-transition button the most, but I like using the T-bar to control the speed of dissolves. Every so often I'll use it to stop a wipe halfway for a split-screen, but we don't use wipes all that often. With news we'll groan over the stingers (too many times) but also some of those cheesy picture-in-picture swooshes. MEs are our friends too. Loved seeing all the different varieties you have - if you want some experience live-switching a real show, get in touch, haha!
55:37
SD is still suitable for those screens at concerts; an LED screen that far away doesn't need to be HD.
As a matter of fact, tomorrow I work at a huge concert "Musicshowscotland" in Rotterdam _(The Netherlands)_ with SD LDK cameras on huge LED screens. Works like a charm.
I remember being in the video production club in highschool and using these to produce the morning news at school. We had TV sets that piped our production to every classroom. I was thoroughly surprised to find out that almost nobody else had these in school.
I was also in a video production club in HS, but I graduated in 2020. I'm incredibly jealous because we did it in the most boring way imaginable: Prerecording everything over the course of a week, editing it in Final Cut, then uploading the rendered video on YT 😅
Most schools didn't even have morning news, the students *maybe* did some announcements, but otherwise didn't do anything
In my TV viewing experience, the stingers are most often used when switching between a live feed and a replay in sports content. Like, regular cuts are for changing angles on the same action, and stingers are for... ehm, time travel. Likewise, in news they seem to be used most often when switching to a remote location or a prerecorded segment.
So, even when not done particularly well or seamlessly, they still serve the purpose of marking that particular cut as a "bigger event" than a regular one - a visual cue for the viewers.
I think it's also relevant that you expect people to only half pay attention to the news (and sports, to a lesser degree), and only direct their attention to the parts of the broadcast they care about. A stinger usually gets animated in otherwise unusual, full screen ways, which is good for alerting peripheral vision that something is happening
Synchronizing your finger snaps to the example cuts at around the 17 minute mark was *chef's kiss*. I went back to rewatch that a couple times.
thank you, it is the kind of touch that I try to include
I'm a huge nerd but I ended up going to film school. I used to collect so many cameras and stuff, but the drudgery of adulting has largely made that difficult for me. Thank you for sharing your collection, I LOVE this stuff. One day I hope I can collect some older camera and editing gear to add to my workflow. Not for like A- or B-type shots, but the real thing is WAY better than any tomfoolery in the NLE.
The reason that old gear was built as a "rackmount" unit was because the workstations in the control rooms were (and to some extent still are) 19" racks. Instead of having a unit that sits on a horizontal desk, the desk itself is sloped and has 19" rack rails built into it, then the gear is mounted directly into the desk. If there's no gear in a section of desk (e.g. the director's workstation), they instead mount a blank panel into the desk.
Or maybe a shelf for snacks
I completely understand why video mixing is so interesting to you, these devices have always fascinated me! I remember when I was young and I was invited to take a look behind the scenes of a big television broadcast station. It was so interesting!
Man, every video you make is so interesting. I can only imagine how much content you have just, sitting around. I'd love to see more videos where you just talk about random appliances and their history. Keep up the good work!
Gravis, that Ampex ADO cost half a million dollars back when it was released, consumed 60 amps of 240-volt AC, and contained two 200-amp five-volt supplies and two 50-amp plus-and-minus-twelve-volt supplies! The mainframe was huge and very difficult to repair... Ask me how I know! :) We had four of them back in the late 80s!
holy crap - that's about what I figured on the price, but I can't imagine having four of them. I figured these would be a single object that was treasured and revered by everyone and you had to schedule time on it, haha.
@@CathodeRayDude We had several Abekas DVEs, video switchers, and DDRs as well. They were mostly analogue component video and parallel digital CCIR-601, before serial digital video! The video was run from frame to frame with high-cost, low-capacitance (2 to 5 picofarads at 25 metres; the cable cost several dollars per foot!) DB-25 parallel cables, with a max length of around 25 to 50 metres!
The Abekas switchers didn't have MEs, but had layers, and you would build the images up layer by layer and then record it off to a 60-second video disk hard drive playback and record digital disc recorder. These were all multiple million-dollar solutions! I had to sign an NDA to never speak of how they were connected together, as that was an in-house production trade secret! serious stuff back in the 80s and 90s!
@@tekvax01 I hope you don't break your NDA by saying that they were connected together by hmmm shhh CABLES!
To add to what @dan b said:
Each ADO unit he described was a *single* DVE channel! These were a separate piece of equipment from the switcher, whereas modern switchers offer DVE on every upstream key. Each ADO channel was at least 10-12U (?) of rack space, about the size of a bar fridge. You'd get two in a full-height rack. The 4ME Grass Valley Kalypso I used later on was smaller than 1 ADO.
They worked just like a Character Generator would with a mixer. The ADO provides both a video output and a key output. The key output is a black and white video channel that the mixer used to dictate the shape of the key. This is known as External Key, you can even do it "manually" with two cameras. Mixers would either have specific key inputs, or inputs programmed to be tied into video and key pairs.
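The fill-and-key arrangement described above comes down to per-pixel linear blending: the black-and-white key signal acts as a matte deciding how much fill versus background shows through at each point. A minimal sketch in Python/NumPy (function and variable names are mine for illustration, not any real mixer's API):

```python
import numpy as np

def external_key(background, fill, key):
    """Composite a fill source over a background using an external key.

    background, fill: float arrays in [0, 1], shape (H, W, 3)
    key: black-and-white matte in [0, 1], shape (H, W);
         white = show fill, black = show background.
    """
    matte = key[..., np.newaxis]           # broadcast matte over RGB channels
    return matte * fill + (1.0 - matte) * background

# A hard-edged key (e.g. from a character generator) is just 0s and 1s;
# anti-aliased edges come from intermediate grey values in the matte.
bg = np.zeros((4, 4, 3))                   # black background
cg = np.ones((4, 4, 3))                    # white "text" fill
key = np.zeros((4, 4)); key[1:3, 1:3] = 1  # matte with a 2x2 white region
out = external_key(bg, cg, key)
```

The "manual" two-camera trick the comment mentions works the same way: one camera provides the fill, the other shoots a high-contrast silhouette that serves as the matte.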
The ADO terminal you showed a picture of at 27:58 @Cathode Ray Dude [CRD], is the Z80 based microcomputer controller as described in the manual. This did the math part of the effect geometry, key framing and file storage. The ADO terminal fed the geometry to the ADO via a serial data connection. The data was real time - move the picture and you could see the numbers change. You can edit your effect by text input and even had copy and paste functions so you weren't doing everything by eye.
Because these things were so expensive, the terminals had a resource sharing system. You could “acquire” and “release” control of a DVE channel depending on how many you need. That way multiple studios or edit suites in a facility could use the ADOs as needed, using centralised video routing or just a patch bay. A TV station I worked at had 3 ADOs, shared between a studio and an edit suite. They normally sat at a 2:1 assignment, but the studio could borrow the third ADO when doing a bigger show.
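The acquire/release arrangement described here is essentially a shared resource pool. A toy sketch of the idea (class and names entirely hypothetical, not the ADO's actual protocol):

```python
class DVEPool:
    """Toy model of sharing a few DVE channels between control rooms."""

    def __init__(self, channels):
        self.free = set(channels)
        self.owned = {}                    # channel -> current owner

    def acquire(self, owner):
        if not self.free:
            return None                    # every channel is in use
        ch = self.free.pop()
        self.owned[ch] = owner
        return ch

    def release(self, ch):
        self.owned.pop(ch, None)
        self.free.add(ch)

pool = DVEPool(["ADO-1", "ADO-2", "ADO-3"])
a = pool.acquire("studio")                 # studio takes a channel
b = pool.acquire("edit-suite")
c = pool.acquire("studio")                 # studio borrows the third for a big show
assert pool.acquire("anyone") is None      # pool exhausted until something is released
pool.release(c)                            # back to the usual 2:1 split
```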
I helped one of our engineers replace a PSU one day. I remember us joking “200 amps would kill you, but the 5 volts would do it real slow”. They're the size of a small car battery. All linear PSU technology. The internals were amazing electronic design too. The “motherboard” was a backplane built with wire wrapping. Every connection is an individual hand placed wire. All the same colour too.
While the quality of the video the ADO produced was quite dated by the time I used it, the unit as a whole was a joy to use. The terminal had a purpose-built key layout and a full screen to display geometry data on. DVEs built into mixers all had to shoehorn the data display into whatever was available in the mixer. Large-scale units like the Kayenne and MVS have a PC running their menu systems, but the DVE data still has to work within the overall layout. Like @Cathode Ray Dude [CRD] says: these things are a conglomerate of user interfaces and a great study in the different possibilities for achieving the same outcomes.
Saw your mention about radio automation software. Even the older stuff can get absurdly complicated there - interacting with GPIO/RS232 to mixers and even ISA codec cards on the ancient stuff. Looking forward to see what your video on that’s like and somewhat hoping I don’t recognise the equipment! 🤣
I’ve got a few videos on the @technicallymindedtv channel if you’re thoroughly bored covering broadcast radio with a UK bent.
The Datavideo SE 500 was considered a high end piece of kit here in the Philippines even up until 2020 before the ATEM Minis took the world by storm. Got to use one in 2013 in high school switching with two consumer CCD cameras and a Macbook converted from 1024x768 VGA to 480i Composite and loved it haha.
It’s fascinating that you can score a full HD video mixer for just $300-500 now, thanks to the demand for consumer-level streaming brought on by the pandemic.
I used to have an MX50 in a university video studio I worked in. You can get a straight cut between A&B busses by hitting the auto take button on the right of the T-bar, when the transition knob is set to minimum (all the way to the left). I do remember using this mixer with a programme / preview workflow - if I recall correctly, the preview output displays whichever bus isn't currently being sent to the programme output. I can't remember whether the auto take swapped the busses, or just simulated the T-bar switching over - I think the latter, as I have a recollection of the T-Bar's physical position being out of sync with the current programme output - hence the little red LEDs on the left of the T-bar to indicate which bus is live.
Thanks for the trip down memory lane!
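The out-of-sync behavior described in that comment can be modeled as a tiny state machine: pulling the physical lever across always lands the live bus where the lever is, while an electronic auto-take flips the live bus and leaves the lever where it was, hence the indicator LEDs. A toy sketch under those assumptions (not the MX50's real internals):

```python
class ABBus:
    """Toy model of an A/B-bus switcher where auto-take can leave the
    physical T-bar out of sync with the live bus."""

    def __init__(self):
        self.live = "A"      # which bus currently feeds program
        self.tbar_at = "A"   # physical lever position

    def program(self):
        return self.live

    def preview(self):
        # Preview shows whichever bus is NOT on program.
        return "B" if self.live == "A" else "A"

    def pull_tbar(self):
        # Moving the lever fully across puts its destination bus on air...
        self.tbar_at = "B" if self.tbar_at == "A" else "A"
        self.live = self.tbar_at

    def auto_take(self):
        # ...while auto-take swaps electronically, leaving the lever put.
        self.live = "B" if self.live == "A" else "A"

sw = ABBus()
sw.auto_take()
# Program is now bus B, but the lever still sits at A: out of sync,
# which is what the LEDs beside the T-bar are there to disambiguate.
```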
I was the TD for several small live video productions at a university during COVID. I operated four Panasonic remote controlled cameras, as well as doing video mixing with a Blackmagic ATEM Television Studio Pro 4K. A bit overkill for four cameras but I'm not complaining. There were I believe six other inputs, but I never used them as they were video sources from computers and unplugged cables - I swear one went to a security camera backstage.
That might well have been a security camera. I could see one benefit being so you can tell who's gonna run out from behind stage. Or what mischief might be going on. Plus a security camera is way cheaper than a professional camera.
I work at broadcast events for eSports across the globe, especially for CSGO and fighting games. We still use a traditional video mixer, generally a large blackmagic one just because we have one, but every passing day we are automating our broadcasting, whether it's hooking to an API to get game information and switching automatically or synchronising AR with cameras, lights, pyrotechnics and the like. We most generally use vMix nowadays. A LOT of vMix. Like, ~5 instances of it at a time interconnected at different parts of the venue, to create one cohesive broadcast experience. It feels like the general trend in gaming broadcasting is going towards having one "prod table" that takes care of EVERYTHING and connects directly to vmix to switch things around. Especially in smaller, one game one stage broadcasts we are completely skipping over the video mixing at times and connecting all cameras directly to a Decklink and using the XL streamdecks with Bitfocus Companion to use as a glorified mixing board.
God I'd give anything to see a full video on that Ultimatte model you have, that thing looks more like some vintage test equipment and I'd love to see the quirks of pre-PC chroma keying
It is apparently an INCREDIBLY advanced piece of gear. So advanced that it's either broken or I don't know how to use it. But trust me, if/when I get it working, it'll get a video.
@@CathodeRayDude i hung on to my cheat card for the ultimatte iv for the longest time through several moves - I’ll see if I can find it. It is really hard to get it going just by fooling with the front panel till you close in on what you want. I got myself in real trouble once just thinking I could wing it first time out.
Later, in the mid-80s, they came out with the Ultimatte 300/Newsmatte, which was smaller and only had a few knobs; much easier to just do the basic weather-map effect while retaining shadows.
@@CathodeRayDude I found it! The quick setup guide for the Ultimatte IV, with the troubleshooting guide on the other side. I even have two copies, so I can just give you one if you want the original on cardstock.
I sent a scan to your gmail
About 10 years ago, I got the wild idea that I wanted to "run an analog TV station" as an art project for an annual event. While I researched all the Broadcast (AKA radio) details, I scoured eBay and local auctions for anything super old (eg: affordable), but never found anything. I also needed at least one other person to help with everything, which I also came up empty on finding. Years pass, and it never ends up happening.
FF to today, and watching this video has reinvigorated my desire to make it happen. Time to start my searches again! Thanks for the info and the spark needed to rekindle an old idea!
I'm learning VJing and modern video mixing in software, so this older equipment is a really fascinating insight into the history of effects systems and their UI design choices. I can see some interesting parallels like the large main lever on the right side of most units, which gets an equivalent analog lever on almost every recommended input system today going into a PC
I found a game recently on Steam called "Not for Broadcast" it's about running a video mixer for a news show, making sure you keep the camera on who's talking, bleeping out cuss words, etc. there's a free demo which is how I played it.
45:43
I've been a professional camera operator for too many years, and I must say that those cameras, with the million buttons and switches, are easier to operate than consumer ones, where you have to dig through menus to get things done.
Of course, nowadays there are a lot of menus in modern cameras too. But actually operating a camera is easier with dedicated buttons and switches.
That is true, but it adds to the cost. Also I suspect all those moving parts are more opportunities for failure (e.g. from dust and perspiration getting in).
We use DVEs mainly to set two separate video feeds in a PiP-style arrangement, mostly controlled by the T-bar for coarse positioning; then we leave it there as we take the shot. Works great for artsy shots during jazz solos with a wide and a close-up, or even fancy music-related live content mixed with the actual musicians, e.g. a circle of weird colors from a separate feed off the LED wall and a wide shot of the same LED wall but with the band in front of it. But I mean... that was 2 out of 150 productions last season, so DVEs are mostly just for kicks and to please the creative enjoyment of our directors :D
If you're talking about analog composite video: if the two signals are synced, you can actually mix them with simple summing. That is how that old rackmount mixer you have there works. It's not doing any frame buffering or anything; it just switches video, and will do summing if you do a fade.
But that requires not just Genlock, but also synchronized phases of the color carriers? And, of course, it only works for NTSC and PAL ... which is why even in France, most studios used to work with PAL or component video internally and only converted to SECAM at the very, very end of the pipeline.
Short answer: no, you can't do simple summing.
The longer answer is: if you have two perfectly synchronised monochrome video signals (which requires your sources to support genlock) and you sum only the video portion of those signals (not the sync within them) and divide the output by two, that would work. As soon as you want colour, your signals must be RGB with each channel summed separately. But in composite colour video such as NTSC, the colour is encoded in the phase of a subcarrier riding on the luminance signal. You could try synchronising to the 3.579545 MHz colour burst - good luck with that, when a quarter-nanosecond difference will produce a major colour shift - but even if you could do it, you'd still see huge colour shifts when you mix.
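For the case where simple summing does work (genlocked monochrome or per-channel RGB signals), a dissolve is just a weighted sum per pixel, with the T-bar position as the weight. A sketch of that arithmetic (names are mine for illustration):

```python
import numpy as np

def dissolve(frame_a, frame_b, t):
    """Mix two synchronized frames; t is the T-bar position in [0, 1].

    t = 0 -> all of source A, t = 1 -> all of source B. This only makes
    sense if both sources are genlocked, so that pixel (x, y) refers to
    the same instant in both frames. For composite colour you would also
    need subcarrier phase lock, which is why this trick is limited to
    monochrome or component (per-channel RGB) signals.
    """
    return (1.0 - t) * frame_a + t * frame_b

a = np.full((2, 2), 0.2)    # dark frame from source A
b = np.full((2, 2), 0.8)    # bright frame from source B
mid = dissolve(a, b, 0.5)   # halfway through the fade: every pixel is 0.5
```

The "sum and divide by two" in the comment above is exactly the t = 0.5 case of this formula.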
I was the lead producer on my middle school's TV station from 2018-2020 and we used a Panasonic WJ-MX50 every day - we were a 4:3, SD production house, using three Panasonic Mini DV camcorders hooked up via composite to the switcher, and to three small B&W preview monitors, with the program display output going into the JVC version of the CBM 1084 CRT monitor. For some reason, we also used an external audio mixer.
We never got to add the character generator, but I would've loved to and want to get my hands on my own. I do understand your issues with A/B but that's just always how we did it and it always worked for us (with only three cameras, you could always keep your eye on all three - the morning news show had camera for boy anchor, camera for girl anchor, camera for both anchors, plus a fourth VTR input).
Back when I was pulling cables for ESPN it always boggled my mind that the tech director was (on some level) tracking a dozen cameras. They had assistants to help with comms and spotting and everything else but still ! A dozen dang video feeds and one person is deciding what millions of people see
A few years back I learned what's involved with audio for sports games: same thing, one person literally listening to multiple microphones. I do audio mixing for bands and I feel like I can't comprehend it. And then you throw surround sound in.
I think the grand-daddy of all video effects units came from a British company called Quantel. They had products with names like “Paintbox” and “Harry” (?) with six-figure prices, that could play around with video, in real time, in quite mind-boggling ways. If you look at old Kenny Everett TV shows, I think a lot of the mad stuff he did there was done with a Quantel machine.
Then there was an earlier product called “Scanimate”. This did all its amazing video warping and shattering and stuff with entirely analog circuits. For a brief time, they were apparently renting this thing out to customers (mainly producers of TV adverts) for $10,000 a day. I think there is one unit left still functioning, in the garage of a former engineer of the company, who has a YouTube channel where he shows it in action.
This answered so many curiosities I had about these devices! Great work!
This is clearly the best introduction on YT into live mixing for people with computer experience but no AV knowledge so far.
WOW! major flashback dreams about that Panasonic switcher in the front row!
That was one of the first switchers I used in a mobile production van! They were tanks, and almost never let you down!
They didn't do much, but what they did, was awesome and very reliable! Loved the unlockable T-bar Mix effects / wipe lever! Funtimes!
I did public access back in the late 80s through early 90s - our video switcher board had all sorts of features, but was still only capable of doing 480i.
I did the whole producer training, and helped produce many episodes of a local zoo show, as well as a lot of sports events at local high schools.
We even had a couple of Video Toasters... but almost all of our studio cams still used pickup tubes (CCDs were out but really expensive then) so good luck doing anything fancy. :D
I was involved in community TV and radio on the 90s. It was so much fun.
Since people seem to find this interesting I'll call out some of the gear we had in each of the studios (two large ones and two small ones).
We also have several mobile truck studios which had similar but more condensed sets of more or less the same. Almost all of this stuff was rack mounted in a series of panels behind the switch board area (but within reach) and along the side wall.
- the main video switcher - much larger than the ones shown in this video - ours had at least a dozen wipes/fades/etc and several modes to insert an inset (used way back in the day to show a scoreboard of a sports event in the corner of a live shot). Pretty nifty for the era, although far from top notch for the era (we were a public access station with decent stuff, not a major network affiliate).
To run this, you needed to demonstrate mastery - this person has the greatest ability to make a goof, especially if you're doing live (for sports events, we often did).
- VTR units - i.e. VCRs, but of the U-matic 3/4" tape format - you'd have one unit set as master record, and several others where you could cue up various B-roll clips or the on-site shots you pre-edited; have your host introduce the video, then cut to that tape, back to the host once it's done, repeat, etc.
- sound loop players - little short loops of audio in nice little tape modules - insert a pre-recorded audio advertisement, etc. Push the button and it immediately began playback; once done it would stop, cued at the right spot for the next time - pretty neat for back then.
- SFX/audio mixer board - same idea as the video switcher except lots of channel sliders; similar to a band's touring mix board but only a dozen channels or so. Nothing fancy like recording studio board (those have dozens/hundreds of channels).
- master signal generator / genlock - a rack-mounted master clock - it sent a signal to all the video gear and cameras to keep them in sync. A requirement under the old NTSC standard if you wanted to avoid the picture rolling when you switched sources.
- light mixer board - control the various lights in the studios - this was pretty fancy as we had a variety of plain roof lights, smoky gels, indirect, etc. Sadly not configured to use that protocol where you can send signals in real time, so somebody had to run this when you were doing a host intro and fade out at end of episode for the studio stuff.
- studio cameras - generally three huge tripod mounted monsters - they did have a limited ability to move so we would set them up for a show then leave them in place. Wonderful setup - preview display on top with red light if master was using your feed (so you'd be careful not to do anything dumb), connection for headset so you could whisper to control room and hear their instructions - "camera 2 get a close up of guest", "camera 1 get wide shot" etc. while cam3's feed was being used.
They had the nice hand grips and were very well balanced, so it was easy to aim it; zoom was a thumb control on one hand and focus (haha you expected auto-focus?) was a twist on the other hand.
If you aimed at the lights for more than a split second you would ruin the tube, which cost at least a few hundred bucks to replace, so just don't do it. :D
- a whole row full of preview TVs - one for each of the VTR units, one for "master out" and several others.
- shoulder cameras - we had a few smaller ones more along the size of news gathering gear - but that took U-matic tapes, so were bigger than a VHS camcorder. Once again, don't point at the lights - although these were a bit more tolerant than the studio cams.
- giant room full of audio cables and reels of video cables - audio was normally done with the same stuff professional audio people use, nice rigid cables with XLR connectors that actually clicked and stayed connected :P Video was done via cables at least 1/2" across containing 40+ conductors - the end connectors plugged in and you twisted to lock them in place.
- Going out with a truck to do a sports event was similar but often with less gear - 2 or 3 cameras for volleyball - maybe up to 5 for homecoming football. The "fun" part about those events was needing to do setup and teardown - and making sure to place your precious cable bundles somewhere they won't be constantly walked on by people who just can't seem to be bothered to step OVER the obviously very expensive cable. I wish I was kidding - those video cables were like a magnet for "moms with high heels walking out of bounds because they think they can" and we'd have to shoo them away. :D This was the point in my life when I learned about rigger's/gaffer's tape - like duct tape but much much better (even if more expensive).
@@chouseification it's like what they say about fiber optics: the yellow digging machines will come for it. I did audio mixing for a band at a race (running, fundraiser, nonprofit) and wouldn't you know it, of all the places, the pathway decided to form right where all my cables ran from the mics to the mixer, and I didn't have a snake or longer cables at the time.
@@imark7777777 yeah that's how it always works... and if you'd known, you could bring those big rubber guard strips (heavy monsters that they are), but venue often claims "clear safe path" which means "only family members have access" but they're nosy and step on things, or worse it's near some VIP area and these folks think their access to the fancy bar somehow allows them to destroy your gear. Yikes.
As to fiber optics, I was working at an ISP around Y2K and during some digging work, a major fiber between POP sites was cut near my metro area - luckily we didn't drop completely due to several other links but the traceroute was odd until that link came back up (some poor tech had to try to splice the old fiber or maybe they had to run it again between nearest transceivers). :P
One of the oddest shoots ever was for a really awesome cover band called The Dweebs from Wisconsin. They're really talented and can do a whole lot of songs. We drove way the hell out to La Crosse to film them do an Oktoberfest show - excellent show, great sightlines, and we had a direct tap into the audio guy's board. The only problem was that two of the three cameras (up high to be able to film over the large crowd) were in the balcony... a wooden balcony built by who knows who, that bounced every time these way-too-drunk (it is Oktoberfest) peeps - who were some special club members for the festival and thought that got them the right to be there - attempted to "dance".
Sure if they stood mostly still and silent no problem, but they were bouncing around like a bunch of fools, and I wasn't really feeling happy about the structural engineering of the wooden balcony. It took some fetching of organizers to get them to shuffle off to somewhere they could annoy other drunk people, but eventually we got the loft back (we let a few chill folks remain with us though, no need to be mean).
The camera down on the floor was doing a variety of crowd shots and band closeups (we knew when specific folks would have solos coming up, etc) but that was just a pimp daddy camcorder securely strapped to whoever's turn it was to do that rig. :P
@@chouseification yeah wasn't that fancy for the rubber cable guards at that time as I was literally 5 to 10 feet away from the band.
Although I did do the music tent for our small local county fair (minus the animals, so we call it an expo). We had a fiddle contest where all three judges decided they were going to move into my tent and act like they owned the place. And then they complained about being behind everybody and not really able to see the players - shocker, it's almost like they were sitting behind the stage. Oh wait, they were off to the left.
I've been doing an international water tasting for 3 years now and I love offering up press feeds. Occasionally we'll have other groups come in and film - TV, documentary, etc. I've only had one person who understood what that meant, where I didn't have to explain, and they took me up on it.
I've also edited videos where having a direct feed from the board is so much better than having the room audio from a gymnatorium.
Oh yes, the fun situations we find ourselves in. I got into a concert at a nearby local theater free for helping them tear down. Not only did half, if not all, of the people coming in have earmuffs and hearing protection on (the entire thing was a horrible experience for me), the whole thing actually sounded better in the lobby over the fallback system. Oh yeah, where was I going with this: the old theater balcony, needless to say, started shaking with the spectators and even made one person nauseous. It's nice that you got them kicked off, and also nice to keep the responsible ones. Rickety structures, eek.
I was asked to film today (or yesterday, now) at my local Arts Place: they had a kids' art thing in their black box theater, which has four posts, and guess where I was. I just barely managed to get angles, and on top of that they had a band set up at a completely different, weird angle that I had to jump back and forth to with one camera. I did it free as a favor, so I didn't feel like (nor have time to) grab the second camera. I did get a compliment from the new director on how unobtrusive I was and how I got a decent angle, and on wanting to work with me again. The funny thing is, under the previous director I used to volunteer at the digital media center that fell apart, which had a three-camera community TV studio.
There's a really small town near where I live, and every so often they lose phone and 911 service because the line somehow breaks between here and there. Lucky you still had service - from what I understand, a lot of lines get consolidated down to the point where paths that might have been redundant on either side of the street are, after consolidation by some penny-pincher, now running through the same cable.
I love this video! I've worked in video production for 10 years and it's neat to see someone so interested in the most mundane part of my job. Some little notes from my personal experience on what you talk about in the video. (Disclaimer: I did not go to school for any of this, I learned as I went. I got a job on a video crew and 10 years later it's now my career, so I might just be some yokel and I've been doing everything wrong this whole time.)
1. Its funny that you mention running a broadcast with a dinky mechanical switcher but I have actually done that once. I had a baseball game where a foul ball struck my tricaster 400 and it would not turn back on. We used a 4 way SDI switcher to run the rest of the game and cut back to a camera pointed at the scoreboard every play so the viewers could keep up.
2. When it comes to transitions, we also only use cuts or fades (or whatever branded stinger ESPN gives us; also, we don't call stingers transitions, we call them "video cuts," not sure if that's normal or not). My understanding is that the goofy transitions are largely used by churches, who have lots of money for the expensive switchers but are not really run by artistically minded professionals. I have done lots of "live editing" using a mixer and never once used a transition besides a cut or fade.
3. We do the same preview/program switching for sports, but usually leave one team's coach on the preview so I can grab the reaction to a bad play.
4. On the topic of sound mixing: it is by far the best job in the industry, because you get a room all to yourself and nobody bothers you as long as everything works. You also get to hear all the gossip the reporters chat about during the commercial breaks.
I miss my sound booth days.
5. The T-bar is good for emotional interviews when you want to really "feel" the transition. I don't use it often, but many people I know use it exclusively; it might just depend on when you got into the business. I love the cut button.
6. Regarding big sporting events and how complicated they can be: usually there are one or more co-piloting producers there to watch the feeds and help pick shots. Along with multiple replay operators, it helps keep the director from being overwhelmed by each individual camera feed.
Sorry that was so long.
Okay, so yeah, the T-bar tends to be for when you want to do a variable fade on the fly. The auto trans tends to be your best friend as a TD. With regards to DVE transitions... as a local news TD, I typically only use two; all the others tend to be "cheesy" imo. The two I do use pretty regularly are a lens flare effect (almost like two stars flying across the screen with a lens flare in the middle) and a page turn. I use the lens flare for lighter stories, or for certain fullscreen graphics (like "If you or a loved one is in need of help, call this number"). I use the page turn to transition between multiple graphics in the same story. I guess I also commonly use a white flash, but I don't think that technically comes from a DVE, since it's just a color being generated by the switcher.
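For what it's worth, the T-bar and the auto trans described above both boil down to the same crossfade math. Here's a toy sketch in Python; the names are purely illustrative and don't correspond to any real switcher's API:

```python
# Toy model of a dissolve: the T-bar position sets the mix between
# the program and preview buses for each pixel.

def dissolve(program_pixel: float, preview_pixel: float, t: float) -> float:
    """Linear crossfade; t=0.0 is full program, t=1.0 is full preview."""
    return (1.0 - t) * program_pixel + t * preview_pixel

# An auto trans just sweeps t from 0 to 1 over a fixed frame count,
# while the manual T-bar lets the operator set t by feel.
def auto_trans_levels(frames: int):
    return [f / (frames - 1) for f in range(frames)]
```

The only real difference between the two controls, in this simplified picture, is who drives `t`: a timer or a human hand.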
Anywho, like I said in my other comment... these things tend to go through automation in local TV news. So we have some central servers that feed commands to the switcher to get it to behave a certain way. It's really interesting how many machines work together to get a production going. For example, that loud thing you had, we refer to it as the frame. It holds ALL the technology for the switching, as well as several other things (oftentimes it has a multiviewer that allows you to view all your sources that are plugged into the switcher). The actual board with all the faders and buttons is mostly just something that allows you to manually interface with the frame, and isn't even required on many modern switcher setups (due to the automation aspect of things). These switchers also have the ability to send commands to your video server, graphics server, etc. (usually referred to as custom controls or macros). This allows you to pre-save some of the effects or transitions without having to manually recreate it every time. There's a lot I haven't even touched on, but the technical nitty gritty is what has kept me in the industry because the technology that goes into it is fascinating to learn and become more of an expert in.
Thanks for the video! Even from my knowledge and perspective, it was really insightful to see from a more general technological pov. Your videos are truly fun to watch.
Thank you so much for enjoying the video and for the info. I've heard a lot about how modern productions, news in particular, have almost no human beings involved during the show other than the talent, and it sounds really depressing compared to the chaos of yesteryear. I would really love to have been involved 15-20 years ago.
@@CathodeRayDude yeah, they've whittled it down in most newsrooms, taking out the TD (with a Director only), graphics op, etc. Some have even gone as far as getting rid of the camera op, audio op, and teleprompter. Luckily, the newsroom I work in is led by a guy who wants just the right amount of automation because he'd rather have human errors than technology breaking constantly.
Interesting, I have not heard the term "frame" used before. It makes sense, and I suspect it's something out of the telephone industry, since a telephone switch is usually put together in frames, which is also where we get the standard 19-inch rack spacing. It's interesting how the technology overlapped and the terminology drifted into other use cases.
I was thinking, those transition effects look useful for livening up changes between static graphics, rather than separate moving video sequences.
@@brycejprince As a TD working in a (mostly) fully staffed, non-automated studio, I hope they never change. Troubleshooting a sleepy cameraman is easy: "Why did you have camera 1 and 2 switched at the start of the show?" "Sorry, I read the shots on my rundown backwards." "Understandable, our show airs very early in the morning and waking up takes time, just try to pay closer attention tomorrow, ok?" Versus potentially hours of fiddling with stuff and trying to reproduce an issue. We had one of our ME busses go 180 degrees out of phase on the colors yesterday (red became blue and blue became red) and engineering is still trying to troubleshoot the issue over 24 hours later.....
Mixing console chassis were custom made furniture for these rack mounted devices. These racks were set up at angles to allow usability as the button farm grew. This was also when you'd build your own component setup. "I want Panasonic switching, Crown Amps, Sony Effects", that kind of thing. It's akin to guitarists building their own rack effects/amp combos back in the day, even pedal board combinations have now given way to amplifier sims. I was a pro guitarist back when all you needed was great hair, we used to call them spice racks. But back to the video side of the world, the days of custom builds like that are over. Modern products have everything built in, even the slope up/stacking in the chassis for the button farm. :)
Your Panasonic there is missing its ears (mounting brackets)
Gravis, on modern video switchers the automation systems such as overdrive are fed from the newsroom automation system, for example, Avid iNEWS, and all of the customs, video frame buffers and video playback, CGs, teleprompters, audio boards, routers, under monitor displays, etc are all sent in real-time to the switcher and everything is under automation. the TD basically then just hits enter, next, next, next, next, for the entire show. This is my main job at work now; the care and feeding of the automation systems.
Emphasis on "hits", the sheer number of times I've had to replace that take button or the whole freaking shot box
This video brought back memories from HS TV Production. We had the NewTek Tricaster.
I've seen fast wipes used by NHK for news and emergency broadcasts. But that's the only place I have seen them. In Europe, news broadcasts use hard cuts and it might be because we see stingers as the exclusive effect of sports broadcasts.
I used one of these back in the day for live editing weddings, theater productions, and my girlfriends' figure skating videos, then offering DVDs before the event was over, or for live-editing skateboarders straight to a big screen at a competition. What a throwback; you're gonna get me to buy one of these for no reason now.
"Nobody's needed one of these for almost 25 years." EMOTIONAL DAMAGE
(had to finish this later) This was such an unbelievably nice treat and trip back in time. TYVM.
I just watched the entire video. This is really stuff I've been interested in. I REALLY wouldn't be surprised if companies used this video as an actual training video, even if you mention yourself that after watching it you might not be able to get a job as a TD. I feel like this could actually be really helpful for new people in the field, to understand the basics of it.
Thanks for making this video! I can tell that it must've taken a lot of work.
I would probably edit it in reverse for a training video. This direction is better for showcasing the tech, not operating it.
The broad strokes are correct, but you can also definitely tell that this is by someone who hasn't actually worked with the equipment in real-life use, and there are several inaccuracies.
Also, it's not really a field of work you can get into by watching training videos. I don't know how it is in other parts of the world, but at least in my area, a bachelor degree is pretty much required to be considered for a vision mixer/TD job in broadcast.
On the Panasonic switcher, you should be able to set the transition time to the minimum, then use the button to cut. That was preferred on the one at our local public access station. We had one for portable setups with two cameras.
The reason to use it as a cut button is that it has a built-in TBC (Time Base Corrector) for each bus, so your inputs do not need to be genlocked. A test you can do: set up two inputs that have action. If you switch between them on one bus, there will be a delay of about one or two frames as it syncs the video. That didn't happen when going between the two busses. Hopefully they fixed it on later versions, but I doubt it.
A quick edit. The built in TBC only synced the signal, no other adjustments that you could get with a good external TBC.
On an early video switcher I used at a public access TV station, you HAD to use the take button. The switcher only had two frame synchronisers and the cameras we had didn't support genlock. So if you did a hot take to another input on the program bus there would be a huge tear in the image.
TD here too! In my past life I used to TD for live horse racing, and the wipe totally got utilized a lot. We used the horizontal wipe about halfway through races.
Cam 1 up top would show a wide view of the field along the top half of the frame. Cam 2 would be tight on the first few horses on the bottom half of the frame.
Stick a horizontal wipe transition halfway between the two, with a nice feather, and what you have is the entire field of horses, tight and wide.
I would also use star wipes and the other wipes when I was bored.
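The parked-wipe split screen described above is easy to sketch in code. Here's a toy model in Python where a "frame" is just a list of pixel rows; this is purely illustrative, not any real switcher's processing:

```python
# Hold a horizontal wipe partway through its travel to composite a
# split screen: top rows from one camera, bottom rows from the other.

def split_screen(wide_frame, tight_frame, wipe_row):
    """Park the wipe at wipe_row; everything above comes from wide_frame."""
    return wide_frame[:wipe_row] + tight_frame[wipe_row:]

wide = [["W"] * 4 for _ in range(4)]   # cam 1: wide view of the field
tight = [["T"] * 4 for _ in range(4)]  # cam 2: tight on the lead horses
combo = split_screen(wide, tight, 2)   # wipe parked at the halfway point
```

A feathered edge would blend a few rows around `wipe_row` instead of cutting hard, but the idea is the same.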
OBS can totally do the trick! I just recommend creating an overlays scene and adding that scene on top of the other "CAMERAS SCENES", and then, enabling the studio mode, configuring shortcuts, and running the multiview window. Now you have a virtual video switcher! With the OBS web socket plugin, you can even have tally lights on your smartphone!
I know it CAN, but I've tried it and I find it incredibly awkward and unreliable.
@@CathodeRayDude I think what gives that feeling is "sharing" the same keyboard with the shortcuts. It seems like you can't use the computer for anything else, even though you can. Also, if you're covering sports and need quick and reliable replays, Vmix is definitely the way!
And if you throw a touchscreen monitor in the mix for your multiview you now have touchable cubes.
I also use a numerical keypad for cross fades, cuts and camera selection.
@@felipeamdd I found out the hard way during testing that if you bind to a key like, say, A through Z, you can no longer rename things... sometimes you'll get that letter and sometimes you won't, but you will also be activating that shortcut.
@@imark7777777 I solved all these problems by connecting two keyboards to the computer and using a piece of software called "HID Macros"; basically, you can assign macros to only one of the keyboards while the second can be used normally. We used to stream our church services with this setup, and for that I used a full-sized keyboard for the macros. Now we have a NeoID Studio 6 video switcher, so all the heavy cutting is done before it arrives at OBS, and we just need to control lyrics overlays and PIP for the Bible reading. So I moved all the macros that we still use to a small numpad-only keyboard.
Director/ TD here, great video! You got everything basically right! You're correct that the millions of wipes are hardly ever used. Manual T-Bar would be used when you want a slower fade/ transition done on the fly without reprogramming your preset transition duration. Also, in some circumstances, I will leave one source half faded over the other for a duration. You're also correct that you could do a rudimentary split screen in a pinch if you wanted to, but I haven't been in a production that has done so.
For modern live broadcasts, say live basketball on ESPN, the real magic happens in the graphics computers that feed the switcher all the fancy graphics to your DSKs or MEs. The actual switching of many live shows is pretty formulaic, but the graphics are where it gets complicated.
It is surreal to watch this video less than 2 hours before I go to work in a media studio.
Someone may have already mentioned this and I just missed it, but regarding all the rackmount stuff: bear in mind it's not just vertical enclosures; there were also desks designed to hold rackmount gear horizontally or at a slight angle.
There’s so many different discussion topics that come from this video. Technique, creative style rules, deeper history and even user interface.
There’s so many points in my careers (plural) where having a control surface made for purpose would be an invaluable addition next to a mouse and keyboard.
@Doug Johnson Productions made a great video about custom controls for video work last year.
I used to work with the modern versions of this equipment while running tech at a theater. We had both a very small, simple 4-input board for our projector and a larger board for our live streaming system (a godsend during covid). I am so glad we are down to one cable per beautiful HD video signal now. I can only imagine the rat's nest of wires and the absolute pain and suffering of diagnosing any problems with those older systems.
I am so glad I'm down to less than one cable. I work for a guy who wanted to reuse what he had to do a "professional" streamed live event. I'm talking two Mevo Gen 2 cameras and 30+ seconds of delay hacked into OBS. (I believe when he asked me, before I started working for him, what camera to get, I said the Mevo Gen 2 that was out at the time, with the explicit caveat that they only support one camera at a time, and yet.) I finally talked him into getting the newer ones that supported NDI. Amazing. Then he wanted three cameras, and hey, guess what, he already has a camera, we could just use that, right... more than the minimum 30 feet of HDMI cable, couplers and headaches, different camera sensor differences.... I finally talked him into getting a third. Which means I now have one Ethernet cable going to the front and a PoE switch that powers three cameras.
Tracking down a fault in a cable vault had to be maddening and then there's cross talk and interference.
55:10 I know the Stream Deck and other painfully expensive switchers exist, but if I were a high profile streamer using lots of cameras, I would love to use traditional mixers. Maybe it's for the blinkenlights and big buttons with positive feedback, but also because it just makes sense and looks cool in the process. Even if the mixer was just an interface device for software on a computer, it just makes sense. You just punch a button in a bus and you're previewing or streaming that camera, no need for a mouse or digging through menus. Sadly, from what I've seen, that's not a thing that exists, at least on a decent scale and price point for consumers.
Agreed, I'd try to find suitable little set of crt monitors to go with it, make the setup part of the show, so to speak.
I might not currently have space for such a rig but I love the idea
Those special effects generators were rack-mounted because they weren't intended to be used "live" by the TD. There'd be one or more of those mounted in a rack along with the VT decks and other equipment and it would be set up before it was needed to generate a specific effects shot like the titles, credits, or the key for the weather report. The main switcher, which might have been an analog unit that was already in the studio for years without special functions, would cut to its generated output when it was wanted. The big Sony panel that was multiple switchers in one, that was collapsing the functionality of several of those into the main board in easy reach of the TD. If what you needed was effectively extra mini-mixers to feed the main mixer, the obvious form factor was a rack mount because you might need to cascade them or make them share inputs, but then main board doing the bulk of the work would be a console-style mixer.
Software engineer here, and I super relate to the phenomenon you're describing. About things seemingly so complicated you can't comprehend them, and yet once you understand the pieces, it's actually relatively simple.
I feel like that's how all of software engineering is.
If all you've ever written, code-wise, is programming exercises like a bubble sort, writing a full software program seems impossible. But once you understand how to take everything, literally everything, and compartmentalize it... so much of my day-to-day work is about as complicated as that TD board. Metaphorically speaking.
i did an apprenticeship in media production (in germany that means you work for an actual company and get paid while also going to school every couple weeks) and i was (un)lucky enough to be put in a school where all the live production equipment was from the late 90s (i graduated in 2021). you'd have a BLAST looking at all the ancient shit we had to use. there was a room with six 19-inch racks all the way to the ceiling that was the backend of the control room. video switchers really do come in all sizes and shapes, nice collection. i learned to operate a larger switcher with all the bells and whistles, and LEMME TELL YA, doing that stuff in a live environment was one of the most stressful experiences of my life. it was fun looking back, but jeez man. im pretty sure every secondhand switcher you buy had a decent sweat patina at some point in its lifetime.
Very cool stuff. I'm primarily used to tools like OBS Studio but it's cool to learn more about the tools that came before. It is funny how many awful looking transitions there are in these mixers. Stingers really seem to cover most bases. I run a live speedrun event every year and this year I'm going to be making my own with my own filmed elements - sure hope that works!
My favorite part is watching the news with all their expensive equipment get defeated by some random guy using OBS.
OBS is not reliable enough or low latency enough for pro use
OBS actually has Studio Mode, which is based on the preview/program way of thinking. I've understood that a big reason people still use physical mixers instead of PCs is that chromakeying is much higher quality on dedicated hardware. I've wondered if OBS could use a full control mixer as a keyboard.
I'm well aware of OBS's weaknesses, and while I'd like to get a hardware mixer someday, a) latency doesn't matter much for my projects and b) I'm not a professional. It's a pretty remarkable tool though, and the fact that an open source and free tool can do so much of what a studio mixer does is pretty amazing.
Great video! I learned a few things and I'm something of a TD myself, at least until the billion dollar corporation I work for changed my title so they can add new responsibilities without adding any pay. It's funny you point out the seat of the pants capabilities of a video mixer. The switcher I use costs half a million dollars and I never even touch it. I'm told that with the fancy system we have, I can do all kinds of things on the fly, but I feel like it's all on rails now. I do my job with a mouse and keyboard using software that automates everything. There was something romantic about having a full crew in Production Control, punching shots as the Director calls them out. Working as a team with the camera and audio operators. Yelling at the prompter to keep up. But those days are over for a lot of us. Now the cameras are all robotic and it's my responsibility to program them and keep them from crashing into each other, the more complicated the show, the more disastrous it can be to make changes once on the air. Nowadays I compose the shots, set the focus, put them on the air, run the audio, punch the graphics, cue the talent, roll the breaks, coordinate with the Hub during live network events (since they shut down our master control), sometimes I even have to time the show. Even the talent scroll their own copy with a foot pedal now. There's no one in the control room anymore but me and a producer. So I don't feel like the huge switcher we have really gives us half a million dollars of flexibility. On the other hand maybe I'm just not using it correctly, often we'll get new equipment (like robotic cameras) and literally only two days of training for it. Don't get me wrong, I love my job but it's changed a lot in the last 10 years. I feel like the last man standing. I have friends at other Stations who still run a full crew, but I'm sure their days are numbered.
the switcher isn't quite the heart of the broadcast studio anymore; that'd be the video router, which is basically just the world's largest video matrix. since so much of what gets broadcast is controlled by automation servers rather than ops, even live tv has a solid chunk of automated sections. usually you have every single imaginable video source and destination connected to the router (the biggest baseband one i've ever seen took up two entire 42U racks and had something like 2048 I/O, and the SMPTE 2110 ones can get even denser) and then the switcher gets only a subset of those, maybe 12 or 24 I/O as relevant to the program.
we have a (small) 512 squared SDI router, and a couple of 64 and 128 squared routers for QC monitoring. I worked for a network broadcaster that had a 2k square video router!
Routers can be huge, but I would argue the switcher is still the heart. Routers are one-trick ponies and can only do cuts (and not always clean cuts), so you actually want to transition between sources through a switcher. Most of the cool things you want to do (USK, DSK, fades, clean cuts) are coordinated by the switcher.
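The "router is just a giant matrix" idea above can be sketched in a few lines. This is a toy model with made-up names, not any real router's control protocol:

```python
# Toy crosspoint router: any input can be patched to any output, but a
# router only does hard cuts -- no dissolves, keys, or clean switching.
# A real router is a crosspoint chip/FPGA; this just models the matrix.

class Router:
    def __init__(self, inputs, outputs):
        self.inputs = set(inputs)
        self.patch = {out: None for out in outputs}  # output -> input

    def route(self, src, dst):
        if src not in self.inputs or dst not in self.patch:
            raise KeyError("unknown crosspoint")
        self.patch[dst] = src  # a hard cut: the output changes instantly

router = Router(inputs=["cam1", "cam2", "replay"],
                outputs=["switcher_in1", "monitor_wall"])
router.route("cam2", "switcher_in1")  # feed cam2 to the switcher's input 1
```

The switcher then sits downstream of a few of these outputs and does the actual transitions and keying.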
Absolutely love the videos about the different cameras and what makes them unique. Not necessarily interested in that stuff as a collector or whatever... But I've enjoyed them thoroughly and rewatched your videos several times. The camera and gear videos are probably my favorite
30:20 sidenote: ever seen a PowerPoint presentation made by someone who just discovered transitions, where every slide has a different one?
Professional TD here, mainly on Grass and Ross switchers.
WIPES are kind of a misnomer these days. We don't use them to transition between sources much; we use them all the time to create effects during preproduction. Not to actually wipe between sources during a shot, but you can create a circle wipe between an image and a slightly darker version of it to make a spotlight effect, or, as I did one time during an ESPN football show, put a circle around our location on a radar to show that there was a big storm cell about to pass over us. When you combine a wipe with an effects dissolve on an E-MEM, you can even have all of this animate. I have a couple of moves in my show just in case I need to pull out an animated vignette really quick or something.
This might be a hot take or completely true, but I have the feeling that OBS could be turned into an entire production video mixer. All it needs is a way to get a lot of sources in, and a control deck.
I think that's incredible considering it's free software. I could see entire productions being done on OBS in the near future, and to an extent I could argue that's already happening.
I will say, however, that the reliability is nowhere near as good as production mixers, but it's certainly a lot better than any other option out there for low-cost productions, at least at the moment.
The problem with OBS in my experience is that it's too stupid to understand how to switch sources reliably. In a mixer, there's an abstraction layer. All the sources are constantly being ingested into framebuffers, so you can pick whichever one you like. In OBS, if you switch inputs on a source, it takes 2-4 seconds to initialize a new video device; and if you try to use the same source in two places, you can wedge the whole app. It's just not designed for it, and while you could _sort of, kind of_ hammer it into submission, it really just isn't built for this.
@@CathodeRayDude there's nothing saying it couldn't be redesigned to do things like that (open source software invites that), and there are also dedicated PCIe capture cards, but it seems like the way it is now, it's not ready to be used for something that important. Although I reckon the "computer bs" could be fixed if you modified the OS to be dedicated to streaming and nothing else, but at that point it may be better to use something else, I'm not entirely sure. I know Linux was used a lot for computer-aided DJing, and the first rigs used a custom Linux distro designed to be dual-booted with Windows XP. I assume at some point someone has locked down their Windows stream box or used Ubuntu Studio to get more reliability, but OBS is OBS.
@@JessicaFEREM It could probably be done, but it would mean a redesign from the ground up. Big OBS updates are enough work already; you would need a paid dozen programmers to realistically make such a big change in a reasonable amount of time. And you're probably right about the OS too. Windows has a lot of overhead, but Linux also has a lot of overhead in comparison to dedicated FPGAs. That's why there's a market for hardware mixers. You can strip Linux down to its bare parts, but at that point you could build something better anyway. But I'm not a software engineer (yet) and I have no experience in OS development. I was a TD in high school, and because the productions were so small, there were a bunch of things that weren't industry standard.
@@xWood4000 that's good to know, thanks!
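The "abstraction layer" point a few comments up (sources constantly ingested into framebuffers, so a take is instant) can be sketched as a toy model. These class and method names are hypothetical; this is not OBS's or any real switcher's internals:

```python
# Rough sketch of a switcher's ingest layer: every source is always
# being captured into its own framebuffer, so a take just changes
# which buffer the program output reads from -- no device re-init.

class FrameStore:
    def __init__(self, sources):
        self.buffers = {name: None for name in sources}
        self.program = None

    def ingest(self, source, frame):
        self.buffers[source] = frame  # always running, even off-air

    def take(self, source):
        self.program = source  # instant: the frame is already in memory

    def program_frame(self):
        return self.buffers[self.program]

store = FrameStore(["cam1", "cam2"])
store.ingest("cam1", "frame-A")
store.ingest("cam2", "frame-B")
store.take("cam2")  # cuts with no initialization delay
```

The contrast with re-opening a capture device on every switch (the 2-4 second stall described above) is the whole point of the buffer layer.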
Thank you for taking the time to make this video. Very fun to watch, and very informative as I look into streaming and potentially working with similar gear.
engagement!
Actually, most mixers in the consumer range could not do dissolves, as that requires quite complex circuitry. They could fade or dip to colors, or maybe even had an alpha matte superimpose that could be changed in color. Mixing two video sources, however, required a sync signal from the VTR or camera, or else digitizing the input feed like the Panasonic WJ-MX series did.
Synchronizing both frequencies is what made these things so extremely expensive at the time. I remember the cheapest consumer video mixer that could dissolve between two cameras cost over 2500€.
However, I really like watching your videos about old TV equipment!
Someone should tell George Lucas about wipes😂😂
This is the most comprehensive video mixer video on YouTube. I just got a Sony MCS-8M, and you explained everything I needed to know about this subject. This mixer actually has an 8-track audio mixer built in.
You make the most wholesome and fun videos thank you! This takes me back to local tv shows about tech way back in the 90s. But also with the 'Art Attack' vibe. You're great CRD.
The mixer you used in your demo looks so nice. I love it!
Your explanation about being interested in various professional tech is spot on. I worked with a lot of 80s and 90s-era broadcast video equipment in high school...including I think at least three of the switchers you show.
It’s really interesting seeing the similarities between stuff like this and the Crestron AV equipment I’m installing at work. It’s wild.
Also, I explained to one of my friends that my hobbies include learning about other people’s careers.
I'm from Grass Valley, where Grass Valley Group mixers were designed and made. I currently work at a business that has a museum of their equipment and instantly recognized some examples. I'm extremely interested in this type of equipment and it's awesome to see someone cover it.
Enter the lowly timebase synchroniser and corrector, Genlock, and Sc/H colour frame lock discussion!! Not to mention non-composite and composite sync, blanking, subcarrier, video matching and DA equalisation. Long live the green tweakers!
I like watching people who are passionate about things explain stuff- even if I never really wanted to know its just cool how much knowledge they have about one specific thing
The wipes are still there and are mostly used (German TD here; we're called BiMi, for Bild Mischer, meaning picture mixer) to transition through a stinger that does not completely occlude the frame. So if we have a stinger that at max occludes 30% of the image, the desk transitions (automated) through it.
Your little rant on Vmix reminded me of one of my favourite lines at work, which is also similar to something else you slipped in earlier: "all my homies hate WIDEORBIT!"
I am a TD as well. I've been doing it for almost 15 years on "big brother" style shows, which makes me one of the maybe 10 experienced guys at this in France, if not the most experienced. A few years ago I also started doing live streaming (mostly political, because I am ;) and I acquired my own gear on that occasion (BMD is my brand of choice).
I enjoyed your video a lot. It's a good one for explaining those kinda frightening control surfaces to people unfamiliar with them. It's actually quite simple. To me the trick is more in the doing: which angle, and why, at every point is what actually makes my job exciting. This, and playing with live characters who do not cooperate, as they're living their lives and it's my job to adapt: that is my core job and I love it. I think I don't totally stink at it either, cause I'm still here doing it!
I think the answer to your question "why those transitions?" is that it's better to have them and not use them than to not have them and need them. I'm sure parts of those transitions use the same systems as the commonly used effects, so why not put all the buttons there, just in case.
As a kid, I was given full control of one of those Panasonic units for weekly church programs and irregular public access TV work. One of the larger churches in my area had a direct feed to the CATV headend (which actually served two competing cable networks) and I’d use their gear to do live shows. We quite regularly would film multiple cameras offsite and then playback for a live mix. It was grand fun.
You and I are in the same boat, figuring out how to cover at the right depth! Keep up the good work!
That Panasonic WJ4600 is the one I learned on at a cable access station back in the late '80s.
Brings back memories of a lot of crazy fun low budget productions!
Great video man, I can tell you and your friends are having a good time producing these, this was funnier than I was expecting and loved it. Also really appreciate the references to older videos about "how things work", like the Nimoy one. Keep hoping you're teasing the HVX for a video coming up on it. It is truly an amazing camera. Thanks again
I cut my teeth using a Panasonic WV-5600, a double T-bar bus version of your 4600. It had a CUT button, and was designed to be inlaid into a tabletop despite being a standard rackmount device. It had some silly analog effects and chroma keying, but was a pretty straightforward SEG for running a 3-camera production using WV-5100 studio cameras. It was also the HEART of the TV studio, providing the master genlock sync to every device there, from the cameras to the TBCs on the VTR output.
I have also used the WJ-MX50. It was mostly marketed by Panasonic for use in A-B roll VTR editing systems. The setup we had was 2x AG-DS545 SVHS feeders, one AG-DS555 recorder, a AG-A750 editing controller, and a WJ-MX50 connected via RS-422. Through the magic of buying more feeders and timecode, you could record all your camera feeds to tape and later edit it like a live production.