Phillip Moss
How we made a Comedy Series for the BBC using Virtual Production
This video is a quick overview of how we got into Virtual Production and some things we learnt along the way.
Age of Outrage is the BBC Wales comedy series we made - you can see it on BBC iPlayer in the UK:
www.bbc.co.uk/iplayer/episodes/m0011mct/age-of-outrage
After we finished the series, Clwstwr, who support innovation in the tech sector in Wales, helped us do more research into Virtual Production:
clwstwr.org.uk/
Some of the clever people we learnt from on YouTube:
Richard Frantzén, Matt Workman (Cinematography Database), Aiden Wilson, Greg Corson, Pixel Prof.
187,405 views

Comments

  • @abidkarim6789 · 5 months ago

    This is priceless <3

  • @abidkarim6789 · 5 months ago

    Thanks, dear god, I struck gold today! Huge love and respect to you guys for sharing this!

  • @talkingoleg · 6 months ago

    Thanks for sharing your journey!

    • @talkingoleg · 6 months ago

      What is this trick with the QR code and camera positioning called? Can't find it anywhere.

    • @pogglemoose · 4 months ago

      Greg Corson has done a couple of tutorials on this now.
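      For anyone else searching: the technique generally uses ArUco fiducials (the QR-like squares) to locate the camera relative to a printed marker of known size. Below is a minimal sketch of that step, assuming opencv-contrib-python with the classic aruco API; the intrinsics, the frame file, and the 0.20 m marker size are placeholder assumptions, not values from the video.

```python
import cv2
import numpy as np

# Placeholder intrinsics from a prior lens calibration (assumed values).
camera_matrix = np.array([[1000.0,    0.0, 960.0],
                          [   0.0, 1000.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()

frame = cv2.imread("frame.png")  # one grabbed frame from the real camera
corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary,
                                          parameters=parameters)

if ids is not None:
    # 0.20 m is the printed marker's physical side length (an assumption).
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, 0.20, camera_matrix, dist_coeffs)
    # rvec/tvec give the marker's pose in camera space; inverting it
    # places the real camera relative to the marker, which is what you
    # copy onto the virtual camera in the engine.
    R, _ = cv2.Rodrigues(rvecs[0])
    camera_pos_in_marker_space = -R.T @ tvecs[0].reshape(3)
    print(camera_pos_in_marker_space)
```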

  • @GlenReed · 7 months ago

    Thanks for sharing! I've been trying to figure this whole thing out and there isn't much straightforward info out there.

  • @Dominik-K · 10 months ago

    Thanks a bunch for the video, really amazing! I want to dabble in such productions myself, but it'll still be a while until I can do so

  • @rquwyt · 11 months ago

    Underrated

  • @pixelasm · 1 year ago

    Thanks for also sharing some timings, which gives a better feel for how long getting started in virtual production really takes.

  • @mahmoudzaefi2958 · 1 year ago

    thanks

  • @netozneto · 1 year ago

    amazing

  • @CsharpProgramming · 1 year ago

    Part 2 please!

  • @haxstat · 1 year ago

    The movement of the tripod caught me off guard.

  • @jasonrjohnston · 1 year ago

    And Jim Henson was revolutionizing the virtual production studio way back in 1987 for The Storyteller, and in 1989 for The Jim Henson Hour. Kind of amazing how far we've come in making the tech smaller-better-faster-cheaper, yet the results from using chroma screen look exactly the same today, just in 4K. Volumetric studios are the best virtual ones, as rear projection work has always been better than chroma screens, and now we're taking it further with video walls and tech that tracks the camera in near real-time. Very exciting stuff! Thanks for this generous BTS video!

  • @Shakespierre · 1 year ago

    Hi there! Great work!! Do you use the Intel RealSense with the Vive setup?

  • @adam_the_kiwi · 1 year ago

    Hey, great video, and it breaks it down quite nicely. I'm looking to do something similar; can you please advise on where you got your green screen backdrop and that set-up from? Many thanks :)

  • @MasonPollock · 1 year ago

    One of my limited suggestions: you could possibly reduce the stops of dynamic range in Unreal so you can record the Unreal output in a log format, then manipulate it better in post by matching the 18 percent greys.
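    To make the idea concrete, here is a toy sketch of a generic log curve that anchors 18% grey at a fixed code value; it is not Unreal's actual output transform, and the grey_out level and the stop range are assumptions.

```python
import numpy as np

def log_encode(linear, grey_point=0.18, grey_out=0.40, stops=8.0):
    """Map scene-linear values to a [0, 1] log signal, anchoring 18% grey."""
    # Each stop above or below grey shifts the signal by 1/stops.
    signal = grey_out + np.log2(np.maximum(linear, 1e-6) / grey_point) / stops
    return np.clip(signal, 0.0, 1.0)

# If 18% grey in the real footage and the Unreal render both land on the
# same code value (0.40 here), the two plates match with one exposure offset.
print(log_encode(np.array([0.18, 0.36, 0.09])))  # grey, +1 stop, -1 stop
```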

  • @mysticfakir2029 · 1 year ago

    If only the HTC Mars had been available a couple of years ago

  • @ooohlookitshines3944 · 1 year ago

    We need to collab. Trust me

  • @bazzahill6182 · 1 year ago

    Really good video. One very cheap option for camera matching is to control the real camera with the virtual camera. A pan-and-tilt head on a tripod, controlled with an Arduino Uno and mapped to the scene camera, is simple but effective, certainly as a rig to "play with" as a learning exercise. I use Blender and OBS Studio.
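    A minimal sketch of the Blender side of that rig, assuming the Arduino prints "pan,tilt" in degrees over serial and that pyserial is installed into Blender's bundled Python; the object name and port path are hypothetical.

```python
import math

import bpy     # only available inside Blender's Python
import serial  # pyserial, installed into Blender's bundled Python

port = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=0.1)
cam = bpy.data.objects["Camera"]  # the scene camera to drive

def apply_tracker_sample():
    """Read one "pan,tilt" line from the Arduino and aim the camera."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return
    pan_deg, tilt_deg = (float(v) for v in line.split(","))
    # Map the real head onto the virtual camera: Z for pan, X for tilt.
    cam.rotation_euler = (math.radians(tilt_deg), 0.0, math.radians(pan_deg))

apply_tracker_sample()  # call per frame, e.g. from a timer or frame handler
```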

  • @AnthonyBatt · 1 year ago

    Bravo

  • @stuartrobb673 · 1 year ago

    Now that Intel have stopped supporting their SDK and RealSense, what have you replaced them with, or are you still using them?

    • @pogglemoose · 1 year ago

      I haven't used it, but ReTracker Bliss is now an alternative.

  • @weshootfilms · 1 year ago

    What is that green material you used for the floor? I've been trying to find something like that.

    • @pogglemoose · 1 year ago

      Cheap green, really thin carpet, supplied by the same guy who installed the green screen.

  • @sidneydore · 1 year ago

    Using a camera such as the RED Komodo, how would you suggest ingesting the virtual scenario in real time? What capture device would you use? Is it possible with the Blackmagic 12G HDR Video Assist? And how many trackers are recommended for a good result?

  • @arupsan · 1 year ago

    Many thanks for the explanations …

  • @pfuisi · 1 year ago

    Well done guys. Fun to watch and the tracker hint is nice too. Cheers

  • @jaerichards · 1 year ago

    This video just changed my life

  • @ThadeousM · 1 year ago

    Very inspiring stuff! Always great to see peers at the BBC pushing modern practices. I hope to shake your hand one day.

  • @MaruHieta · 1 year ago

    Inspiring and cool!

  • @martin_minds · 1 year ago

    Thank you very much for sharing this. Good luck with your show.

  • @iknahid · 1 year ago

    Thanks for the video. Very informative

  • @FrancoRicchiuti-dj5iv · 1 year ago

    This is simply Great! Thank you for sharing your journey!!!

  • @Jr3n660 · 1 year ago

    Amazing....👍

  • @GenreFluid · 1 year ago

    I would love to see this done for a virtual concert!

  • @OlafSperwer · 1 year ago

    Honestly, you guys need an LED backdrop and floor: get rid of green spill and use LED green.

    • @pogglemoose · 1 year ago

      Can't quite see how using LED green would reduce spill?

  • @DGFA- · 2 years ago

    Hi, it would be great if you could say how you measured the tracker (T265) offset. Thank you in advance!

    • @pogglemoose · 1 year ago

      We used the Unreal Lens Calibration Tool to calculate the nodal offset. You can also simply measure the distances from the nodal point of the lens and offset your virtual camera using an actor in the camera chain that is just there for the offset.
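      To illustrate the measuring approach, here is a small numpy sketch of composing a hand-measured tracker-to-nodal-point offset into the camera chain; the 4x4 matrices stand in for Unreal's actor transforms, and all the numbers are hypothetical.

```python
import numpy as np

def pose_matrix(rotation_3x3, translation_xyz):
    """Build a 4x4 rigid transform from a rotation and a translation."""
    m = np.eye(4)
    m[:3, :3] = rotation_3x3
    m[:3, 3] = translation_xyz
    return m

# Pose of the tracker in stage space (identity rotation for brevity), metres.
tracker_in_stage = pose_matrix(np.eye(3), [1.50, 0.20, 1.10])

# Measured with a ruler: lens nodal point 9 cm forward of and 12 cm below
# the tracker, expressed in the tracker's own frame (made-up numbers).
nodal_in_tracker = pose_matrix(np.eye(3), [0.09, 0.0, -0.12])

# Chain the transforms exactly as a child "offset actor" would in Unreal.
camera_in_stage = tracker_in_stage @ nodal_in_tracker
print(camera_in_stage[:3, 3])  # virtual camera position: [1.59 0.2  0.98]
```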

  • @FutureOfCinema · 2 years ago

    Thanks for this super video! There's plenty of advice (maybe all of it xD) to start working in virtual production from scratch, eheh

  • @EricLefebvrePhotography · 2 years ago

    Your issues with motion are strange. Ryan Connolly from Triune / Film Riot has made a few tests with Unreal with tons of handheld camera motion and it looks amazing!!! ruclips.net/video/lRQvi5Jc_Js/видео.html ruclips.net/video/M-AV-bctI1o/видео.html ruclips.net/video/ZJzcO_eXqgI/видео.html

  • @ThinkCMYK · 2 years ago

    How much can a basic studio cost?

  • @ZachHendrixVoiceOvers · 2 years ago

    I bought everything that you guys used for your setup. However, you mentioned how quickly the technology has changed; an example would be not needing the headset. The tutorials I'm watching all seem dated. Where can I find the most up-to-date info?

  • @Arima_youtube · 2 years ago

    Glory to Ukraine!

  • @Michasha9 · 2 years ago

    Brilliant!

  • @skyridestudios · 2 years ago

    @Phillip Moss Can you please do a tutorial on the Aruco markers and setting them up in Unreal Engine? I can't find much info about it and the documentation is a bit hard to understand.

  • @sveng7096 · 2 years ago

    Thanks for sharing! Really inspiring.

  • @Dante02d12 · 2 years ago

    The jittering is absolutely not a limitation of the VR system. I don't know what caused it in your case, but you can definitely get a smooth image with the Vive tracker on your camera. Anyway, this was terrific! Virtual Production evolves so quickly, it's astounding! Last year we had to manually calibrate the camera; now we have a simple, almost automatic calibration system. Here's some advice:
    - Virtual Production is even more interesting when you move the camera. For that, you need to make sure, in the UE scene, that the virtual screen (the rectangle on which you project the actor) keeps facing the camera. That way you can shoot scenes with a moving camera instead of just fixed shots (see the sketch below).
    - You can get shadows automatically with the proper material settings on the virtual screen; no need to place them on each frame manually! The limitation is that the shadow is projected from a 2D surface, so past a certain angle it no longer feels like a natural shadow, but that's just something to keep in mind when working on the lights. Aside from that, you get real-time accurate shadows.
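    Here is the sketch referenced in the first point: a yaw-only turn that keeps the virtual screen facing the camera while staying upright. Plain numpy stands in for the engine's per-tick update, and the positions are made up.

```python
import numpy as np

def facing_yaw(card_pos, camera_pos):
    """Yaw angle (radians) that turns the card toward the camera."""
    to_cam = np.asarray(camera_pos, float) - np.asarray(card_pos, float)
    # Ignore the height difference so the card stays vertical
    # (a billboard that only rotates around the up axis).
    return np.arctan2(to_cam[1], to_cam[0])

# Each tick: re-aim the actor card at the moving tracked camera.
yaw = facing_yaw(card_pos=[0.0, 0.0, 0.0], camera_pos=[2.0, 1.0, 1.6])
print(np.degrees(yaw))  # ~26.6 degrees
```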

  • @onur.nidayi.1344 · 2 years ago

    Amazing! This is really inspiring :)

  • @totallyrandomtalk8303 · 2 years ago

    Great video. Would this work with version 1.0 base stations? The 2.0 trackers seem sold out everywhere.

  • @mullinsmediaLLC · 2 years ago

    this is so inspiring

  • @Megasteakman · 2 years ago

    This was an awesome video! Great overview of the process: very informative! I definitely want to make a show like this, but I ditched greenscreen for mocap a few years ago.

  • @barrybraley7562 · 2 years ago

    This is fantastic. Does this tech work on a Blackmagic 6K or 6K Pro?

    • @pogglemoose · 2 years ago

      It will work with any camera, really. You just need to tell Unreal what the real camera's focal length and sensor size are so that it can match the real and the virtual cameras, plus measure the offset from the tracker to the sensor. The 6K only has HDMI live output, I think, but if you are only using this to make a guide composite shot, you could use an HDMI-USB input device to get the pictures into OBS. The bigger cameras like the Ursa Mini Pro or an FS7 have SDI outputs and also the ability to be genlocked, which means that all the devices in the system are synced: they all start their frames at the same time. That matters more for LED volumes than greenscreen, though we did genlock our system and it worked better.
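      The matching step above boils down to one formula: the horizontal field of view the virtual camera must reproduce follows from the real lens's focal length and sensor width. A quick sketch, where the 23.1 mm Super 35-style sensor width and the 24 mm lens are illustrative values, not settings from the production:

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of a lens/sensor pair, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 24 mm lens on a ~23.1 mm wide Super 35-style sensor:
print(horizontal_fov_deg(24.0, 23.1))  # ~51.4 degrees
```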

  • @JoshuaPaxton · 2 years ago

    Amazing!

  • @Zonuna_Chawngthu · 2 years ago

    Awesome, is there any possibility of applying this to an existing live multi-camera production studio?

    • @pogglemoose · 2 years ago

      With our set-up it would be relatively easy to do more than one camera. For a 3-camera set-up we would need one Vive tracker per camera and separate feeds from the three virtual cameras out of Unreal. The limitation is the number of video inputs and outputs you can get in one PC: we would need three outputs from Unreal, which then come back into the PC as inputs to OBS alongside the three real cameras, plus an output from OBS, which is ten in/out video slots. You can make it work outside of a PC with a vision mixer like a Blackmagic ATEM Mini Pro Extreme, which has four chroma keyers, or use a plug-in like Spout, which will generate the video outputs from Unreal directly into OBS without the need to go out of the PC and back in again. That way you could get a 3-camera multicam working with our one Decklink card, which has four in/out slots.
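      As a toy sketch of that I/O budget, under the assumptions above (one feed per real camera, one composite out of OBS, and Unreal-to-OBS handled either over the card or in software via Spout):

```python
def decklink_slots(cameras, use_spout=False):
    """Count the in/out video slots a multicam rig needs on one PC."""
    real_inputs = cameras    # one SDI/HDMI input per real camera
    obs_output = 1           # the final composite out of OBS
    if use_spout:
        # Spout hands the Unreal renders to OBS in software, so the
        # out-of-the-PC-and-back-in round trip disappears.
        return real_inputs + obs_output
    unreal_outputs = cameras     # Unreal renders leave the PC...
    loopback_inputs = cameras    # ...and come straight back in for OBS
    return real_inputs + unreal_outputs + loopback_inputs + obs_output

print(decklink_slots(3))                  # 10 slots: more than one card
print(decklink_slots(3, use_spout=True))  # 4 slots: fits a 4-slot Decklink
```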

    • @Zonuna_Chawngthu · 2 years ago

      @pogglemoose Thanks, that would mean three Unreal PCs, one for each camera, and then each output goes to a video switcher.