- Video 1
- 187,388 views
Phillip Moss
Added 2 Dec 2012
How we made a Comedy Series for the BBC using Virtual Production
This video is a quick overview of how we got into Virtual Production and some things we learnt along the way.
Age of Outrage is the BBC Wales comedy series we made - you can see it on BBC iPlayer in the UK:
www.bbc.co.uk/iplayer/episodes/m0011mct/age-of-outrage
After we finished the series, we were helped to do more research into Virtual Production by Clwstwr, who support innovation in the tech sector in Wales:
clwstwr.org.uk/
Some of the clever people we learnt from on RUclips:
Richard Frantzén, Matt Workman (Cinematography Database), Aiden Wilson, Greg Corson, Pixel Prof.
This is priceless <3
Thanks dear god, I struck gold today! Huge love and respect to you guys for sharing this!
Thanks for sharing your journey!
What is this trick with QR codes and camera positioning called? Can't find it anywhere.
Greg Corson has done a couple of tutorials on this now.
Thanks for sharing! I've been trying to figure this whole thing out and there isn't much straight forward info out there.
Thanks a bunch for the video, really amazing! I want to dabble in such productions myself, but it'll still be a while until I can do so
Underrated
Thanks also for sharing some timings, which gives a better feel for how long getting started in virtual production really takes.
thanks
amazing
Part 2 please!
The movement of the tripod caught me off guard.
And Jim Henson was revolutionizing the virtual production studio way back in 1987 for The Storyteller, and in 1989 for The Jim Henson Hour. Kind of amazing how far we've come in making the tech smaller-better-faster-cheaper, yet the results from chroma screens look much the same today, just in 4K. Volumetric studios are the best virtual ones, as rear projection work has always been better than chroma screens, and now we're taking it further with video walls and tech that tracks the camera in near real-time. Very exciting stuff! Thanks for this generous BTS video!
Hi there! Great work!! Do you use the Intel RealSense WITH the Vive setup?
Hey, great video & it breaks it down quite nicely. I'm looking to do something similar; can you please advise where you got your green screen backdrop from, and that set-up? Many thanks :)
One small suggestion: you could possibly reduce the stops of dynamic range in Unreal so you can record the Unreal output in a log format, then manipulate it better in post by matching the 18 percent greys.
If only the HTC Mars was available a couple of years ago
We need to collab. Trust me
Really good video. One very cheap option for camera matching is to control the real camera with the virtual camera. A pan and tilt head on a tripod, controlled with an Arduino Uno and mapped to the scene camera, is simple but effective, certainly as a rig to "play with" as a learning exercise. I use Blender and OBS Studio.
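A minimal sketch of that mapping, assuming Blender drives the rig: the Arduino firmware here is hypothetical and is expected to parse "pan,tilt" lines over serial, and the port name, baud rate and axis choice are placeholders.

```python
# Send the active Blender camera's pan/tilt to an Arduino pan-tilt head over serial.
# Assumes pyserial is installed and the (hypothetical) Arduino firmware parses
# "pan,tilt\n" lines; port, baud rate and axis mapping are placeholders for your rig.
import math
import bpy      # Blender's Python API - run this inside Blender
import serial   # pyserial

arduino = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)

def send_camera_pose(scene, depsgraph=None):
    cam = scene.camera
    # Convert Blender's radians to degrees for the servo-driven head.
    pan = math.degrees(cam.rotation_euler.z)
    tilt = math.degrees(cam.rotation_euler.x)
    arduino.write(f"{pan:.1f},{tilt:.1f}\n".encode())

# Send a pose every time the frame changes, e.g. while scrubbing or playing back.
bpy.app.handlers.frame_change_post.append(send_camera_pose)
```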
Bravo
Now that Intel have stopped supporting their SDK and RealSense, what have you replaced them with, or are you still using them?
I have not used it, but ReTracker Bliss is now an alternative.
What is that green material you used for the floor? I've been trying to find something like that.
Cheap green, really thin carpet - supplied by the same guy who installed the green screen
Using a camera such as the RED Komodo, how would you suggest ingesting the virtual scenario in real time? What capture device would you use? Is it possible with the Blackmagic 12G HDR Video Assist? And how many trackers are recommended for a good result?
Many thanks for the explanations …
Well done guys. Fun to watch and the tracker hint is nice too. Cheers
This video just changed my life
Very inspiring stuff! Always great to see peers at the BBC pushing modern practices. I hope to shake your hand one day.
Inspiring and cool!
thank you very much for sharing this. good luck with your show.
Thanks for the video. Very informative
This is simply Great! Thank you for sharing your journey!!!
Amazing....👍
I would love to see this done for a virtual concert!
Honestly, you guys need an LED backdrop and floor: get rid of green spill and use LED green.
Can't quite see how using LED green would reduce spill?
Hi, it would be great if you could say how you measured the tracker (T265) offset. Thank you in advance!
We used the Unreal Lens Calibration Tool to calculate nodal offset. You can also simply measure the distances from the nodal point of the lens and offset your virtual camera using an actor in the camera chain which is just there for the offset.
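For the hand-measured route, the idea is just a fixed rigid offset applied on top of every tracker pose. A minimal sketch of the translation part (the numbers and axis convention are made-up examples, not from the video):

```python
# Apply a fixed, hand-measured tracker-to-sensor offset to each tracked position
# so the virtual camera sits at the lens's nodal point rather than at the tracker.
# The offset values here are hypothetical examples in centimetres.
import numpy as np

NODAL_OFFSET_CM = np.array([0.0, -4.5, 12.0])  # right / up / forward in tracker axes

def virtual_camera_position(tracker_position, tracker_rotation):
    """Rotate the fixed offset into world space and add it to the tracked position."""
    return tracker_position + tracker_rotation @ NODAL_OFFSET_CM

# Example: tracker at the world origin with an identity orientation.
print(virtual_camera_position(np.zeros(3), np.eye(3)))  # [ 0.  -4.5 12. ]
```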
Thanks for this super video! There's plenty of advice (maybe everything you need xD) to start working in virtual production from scratch eheh
Your issues with motion are strange. Ryan Conolly from Triune / filmRiot has made a few tests with Unreal with tons of handheld camera motion and it looks amazing!!! ruclips.net/video/lRQvi5Jc_Js/видео.html ruclips.net/video/M-AV-bctI1o/видео.html ruclips.net/video/ZJzcO_eXqgI/видео.html
How much can a basic studio cost?
I bought everything that you guys used for your setup. However, you mentioned how quickly the technology has changed; an example would be not needing the headset. The tutorials I'm watching all seem dated, so where can I find the most up-to-date info?
Glory to Ukraine!
Brilliant!
@Phillip Moss Can you please do a tutorial on the Aruco markers and setting them up in Unreal Engine? I can't find much info about it and the documentation is a bit hard to understand.
Thanks for sharing! Really inspiring.
The jittering is absolutely not a limitation of the VR system. I don't know what caused this in your case, but you can definitely get a smooth image with the Vive tracker on your camera. Anyway, this was terrific! Virtual Production evolves so quickly, it's astounding! Last year we had to manually calibrate the camera, now we have a simple, almost automatic calibration system. Here's some advice:
- Virtual Production is even more interesting when you move the camera. For that, you need to make sure, in the UE scene, that the virtual screen (the rectangle on which you project the actor) keeps facing the camera. That way you can shoot scenes with a moving camera, instead of just fixed shots.
- You can get shadows automatically with the proper material settings on the virtual screen. No need to put them on each frame manually! The limitation is that the shadow is projected from a 2D surface, so past a certain angle it no longer feels like a natural shadow; that's just something to keep in mind when working on the lights. Aside from that, you get real-time accurate shadows.
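The "keep the virtual screen facing the camera" step is just a per-frame billboard rotation. A minimal sketch of the geometry, independent of Unreal (the positions and the assumption that z is the vertical axis are illustrative):

```python
# Compute the yaw, around the vertical axis, that turns a virtual screen's normal
# toward the camera each frame - the billboard behaviour described above.
# Positions are (x, y, z) tuples in any shared world space; z is assumed vertical.
import math

def yaw_to_face_camera(screen_pos, camera_pos):
    dx = camera_pos[0] - screen_pos[0]
    dy = camera_pos[1] - screen_pos[1]
    return math.degrees(math.atan2(dy, dx))

# Example: as the camera orbits, re-apply this yaw to the screen every frame.
print(round(yaw_to_face_camera((0, 0, 0), (300, 150, 170)), 1))  # 26.6 degrees
```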
Amazing! This is really inspiring :)
Great video. Would this work with the version 1.0 base stations? The 2.0 trackers seem sold out everywhere.
this is so inspiring
This was an awesome video! Great overview of the process: very informative! I definitely want to make a show like this, but I ditched greenscreen for mocap a few years ago.
This is fantastic, does this tech work on a Blackmagic 6k or 6k pro?
It will work with any camera really. You just need to tell Unreal the real camera's focal length and sensor size so that it can match the real and virtual cameras, plus measure the offset from the tracker to the sensor. The 6K only has HDMI live output I think, but if you are only using this to make a guide composite shot, you could use an HDMI-to-USB input device to get the pictures into OBS. The bigger cameras like the URSA Mini Pro or an FS7 have SDI outputs and also the ability to be genlocked, which means that all the devices in the system are synced, so they all start their frames at the same time. That's more important for LED volumes than green screen, though we did genlock our system and it worked better.
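Matching focal length and sensor size boils down to giving both cameras the same field of view. A quick sketch of the standard pinhole relation (the lens and sensor numbers are just examples):

```python
# Horizontal field of view from focal length and sensor width, using the usual
# pinhole relation: fov = 2 * atan(sensor_width / (2 * focal_length)).
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: a 24 mm lens on a roughly Super 35 sized sensor (~24.9 mm wide).
print(round(horizontal_fov(24, 24.9), 1))  # about 54.8 degrees
```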
Amazing!
Awesome, is there any possibility of applying this to an existing live multi-camera production studio?
With our set-up it would be relatively easy to do more than one camera. So for a 3 camera set-up we would need one Vive tracker per camera, and separate feeds from the three virtual cameras out of Unreal. The limitation is the number of video inputs and outputs you can get in one PC - we would need three outputs from Unreal, then these become inputs to OBS alongside the three real cameras, plus an output from OBS, which is ten in/out video slots. You can make it work outside of a PC with a vision mixer like a Blackmagic ATEM Mini Pro Extreme which has four chroma keyers, or use a plug-in like Spout which will generate the video outputs from Unreal direct into OBS without the need to go out of the PC and back in again. That way you could get a 3-camera multicam working with our one Decklink card, which has four in/out slots.
@pogglemoose Thanks, that would mean 3 Unreal PCs, one for each camera, and then each output goes to a video switcher.
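For reference, the in/out slot count from the multicam reply above works out like this (a rough tally, assuming the loop-through approach rather than Spout or an external switcher):

```python
# Rough tally of in/out video slots for an N-camera green screen multicam where each
# Unreal output is looped back into OBS alongside the real camera feeds.
def io_slots_needed(num_cameras):
    unreal_outputs = num_cameras       # virtual camera feeds out of Unreal
    loopback_inputs = num_cameras      # those same feeds coming back into OBS
    real_camera_inputs = num_cameras   # the physical camera feeds
    obs_program_output = 1             # the final composite out of OBS
    return unreal_outputs + loopback_inputs + real_camera_inputs + obs_program_output

print(io_slots_needed(3))  # 10 slots needed, versus 4 on the single Decklink card
```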