Virtual Production - Tutorial 5: Use Your Own Sets

  • Published: 29 Jan 2025

Comments • 48

  • @paolociccone4024 • 3 years ago

    Thank you again for all that you do for the community, it's very much appreciated.

  • @schuriktube • 4 years ago +1

    Thanks Greg, really grateful for your video tutorials.

  • @resetmatrix • 4 years ago +1

    As always, a very clear and well-explained tutorial, thanks Greg!

  • @Jsfilmz • 4 years ago

    Greg, how did you successfully apply Composure to a plane to chroma key? What's the secret? I see you have a blueprint titled flatscreen. The only reason I'm asking is that I don't have a live camera, I only have greenscreen footage.

    • @GregCorson • 4 years ago

      I'm not sure what problem you are having. The media comes into Composure through the "media plate", and you can use pretty much any media as a source, like a webcam or playback from a media player in Unreal. There really is no trick to it; if you are using a webcam or media player, though, you have to have a blueprint somewhere that "opens" and starts the media playing. In my project the "flat greenscreen" object is used to create a garbage matte so that things beyond the edges of the greenscreen will be masked out. If you don't have a tracked camera, you may want to turn the garbage matte off because without a tracked camera it will probably not show up in the right place. I hope this answers your question, if I didn't get it right please feel free to write back.
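
      To make the "open and start the media" step concrete, here is a minimal C++ sketch of the same idea. VPStudio does this in Blueprint, not C++, and the actor and property names below are hypothetical placeholders:

      #include "GameFramework/Actor.h"
      #include "MediaPlayer.h"
      #include "MediaSource.h"
      #include "MediaOpener.generated.h"

      // Hypothetical helper actor: opens a media source at BeginPlay so the
      // texture the Composure media plate reads from has something to display.
      UCLASS()
      class AMediaOpener : public AActor
      {
          GENERATED_BODY()
      public:
          UPROPERTY(EditAnywhere)
          UMediaPlayer* MediaPlayer = nullptr;       // the player the media plate texture points at

          UPROPERTY(EditAnywhere)
          UMediaSource* GreenScreenSource = nullptr; // webcam, file, or image-sequence source

          virtual void BeginPlay() override
          {
              Super::BeginPlay();
              if (MediaPlayer && GreenScreenSource)
              {
                  MediaPlayer->PlayOnOpen = true;             // start playback as soon as the source opens
                  MediaPlayer->OpenSource(GreenScreenSource); // open the greenscreen source
              }
          }
      };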

    • @Jsfilmz • 4 years ago

      @@GregCorson Greg, thank you so much for the quick reply. I was able to use my offline greenscreen image sequence in your project, but now I am trying to figure out how you were able to put your greenscreen footage in the 3D space. Mine is just 2D, so when I move the camera, the greenscreen composite just follows it around. I am trying to achieve something like this: ruclips.net/video/cOlKyx_1dUw/видео.html. Do you see how the green screen is inside the space? Thank you again so much.

    • @GregCorson • 4 years ago

      You can get the example for the first scene in your link by downloading the "virtual studio" example from the Epic launcher "Learn" tab. That one puts the live camera input as a texture on a rectangle that's located in the 3D scene. It doesn't track the live camera, so you can only move the CG camera a bit before the illusion breaks down. There is a new plugin in Unreal 4.25 called "composite plane" which can help do this; unfortunately there is no documentation on how to use it just yet. In most of my stuff, a VIVE tracker is being used to feed the motion of the camera into Unreal in real time. The CG camera moves exactly the same as the live camera, so they always match up. If you don't have a live camera and tracker, you may be able to "track" your recorded footage using something like the camera tracker in Blender and export it into Unreal, but this is a bit complicated and probably not a good place to start if you're doing it for the first time. I haven't done it yet.

    • @Jsfilmz • 4 years ago

      Greg Corson that makes sense, it only works because you have a VIVE tracker acting as a camera. Oh well, I know a method that works but it's tedious haha. Thank you, I guess I'll wait for more docs.

    • @GregCorson • 4 years ago

      If you are only working with pre-recorded footage, you might want to check out the camera tracker in Blender (blender.org), which is free. As for setups where you don't plan to move the camera, you don't really need a tracker; you can line up everything by eye and it will work fine.

  • @ivanmarcos7965 • 4 years ago

    Hi Greg, in your project I see you have mapped 'N' to 'NextTalentMark'. What is this in reference to? We cannot get this input to work in our own project.

    • @GregCorson • 4 years ago

      In my project "talent marks" are the spots in the level where you want your talent to appear. You can have many of these and hitting N will teleport your talent to the location of the next talent mark in the level. Does this answer the question or are you saying that the talent marks aren't working for you?

    • @ivanmarcos7965 • 4 years ago

      @@GregCorson What I was trying to do is modify the action called on N, but I can't find the blueprint where N triggers 'NextTalentMark'.

    • @GregCorson • 4 years ago

      If you want to bind some other key to "Next Talent Mark", that is done in the Project Settings input section. You can have any number of things trigger the "next talent mark" input action: keyboard, mouse clicks, controller buttons...etc. If you want to create completely new keyboard keys that trigger your own blueprints, you do it there too. This tutorial has a section on creating an input action and connecting it to a blueprint: docs.unrealengine.com/en-US/Gameplay/HowTo/SetUpInput/Blueprints/index.html

      If you are looking for the actual Blueprint code in VPStudio that switches between talent marks, that is in VPPlayerController. It's pretty well commented. There are some things tied to "event begin play" that set up an array of talent marks. Later down in that blueprint there is the actual code that changes talent marks. This is a bit involved because you have to move all the items attached to the talent mark (cameras, green-screen...etc) to the new talent mark when you change.
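
      As a hedged sketch, the C++ equivalent of wiring that input action up would look roughly like this, continuing the hypothetical AVPDemoPlayerController from the sketch above (VPStudio itself does this in Blueprint; the action name must match the Action Mapping in Project Settings > Input):

      void AVPDemoPlayerController::SetupInputComponent()
      {
          Super::SetupInputComponent();

          // "NextTalentMark" is the Action Mapping name; any key, mouse button or
          // controller button mapped to it in Project Settings will fire this.
          InputComponent->BindAction("NextTalentMark", IE_Pressed, this,
                                     &AVPDemoPlayerController::NextTalentMark);
      }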

    • @ivanmarcos7965 • 4 years ago

      @@GregCorson I see, yes it's the VPStudio blueprint that I was looking for. Thanks again. Ivan

  • @懒人-b5d • 4 years ago

    Thank you for your help, I learned a lot. I downloaded it and will study it.

  • @nightwatcherfilms9485 • 3 years ago

    Hi Greg, are you able to show any tutorials on how to get live actors to appear inside UE4 virtual sets? I'd like to do some movie-type scenes with actors and want to use virtual sets instead of real sets. Is pretty realistic quality possible with just off-the-shelf hardware and a Blackmagic camera? Thanks - Mark C

    • @GregCorson • 3 years ago +1

      All the demo videos on the channel were done with my VPStudio sample. The basic setup is for actors on a virtual set like you want. As far as getting realistic results goes, this really isn't about the hardware. You need a realistic 3D set, good lighting on the actors, and good camera skills to get everything to look realistic. In my demos I'm working in a pretty small 11-foot-square space, so my quality frequently suffers because I don't have room to really put the lights, actors and camera in the right places.

    • @nightwatcherfilms9485 • 3 years ago

      @@GregCorson Thank you Greg.

  • @dcontal1 • 4 years ago +1

    thank you Greg!

  • @DataMakesItWork • 4 years ago

    Thank you so much for all your efforts so far. They have really helped me learn the ins and outs of the program within the virtual production world. However, I have a question for you. I have a set of cinema primes. Does it make the most sense to map each individual lens to an entire camera rig and then attach that to a puck, or could there be a better way to switch out lenses?

    • @GregCorson • 4 years ago +1

      I think you could have one of my Unreal camera rigs for each lens, or you could make a rig for just the camera/tracker and another rig for the lens that you attach to the camera rig inside Unreal. However, the rig isn't the main thing when swapping lenses; it only provides the offset from the base of the tracker to the lens entrance pupil. So you could just use a common rig and supply the right entrance pupil when you change lenses. The real thing you have to switch is all the lens parameters (focal length, f-stop, focus...etc) which live in the VPCamera objects. You could have different VPCamera objects set up for each lens and swap them on and off the Unreal camera rig when you change. There may be a better way to set up the data for each lens and swap them with a blueprint, but I'm not sure if all the camera parameters are accessible to blueprints.
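
      One possible way to organize that per-lens data in C++ would be a small preset struct pushed onto the CineCamera component. This is just a hedged sketch, not how VPStudio does it: the struct and ApplyLens() are invented names, while the UCineCameraComponent members are the standard ones.

      #include "CineCameraComponent.h"

      // Hypothetical per-lens preset you would fill in once per cinema prime.
      struct FLensPreset
      {
          float FocalLengthMm;         // e.g. 35.f for a 35mm prime
          float FStop;                 // e.g. 2.8f
          float FocusDistanceCm;       // current manual focus distance
          float EntrancePupilOffsetCm; // tracker base to entrance pupil (applied to the rig, not shown here)
      };

      // Push the preset onto a CineCamera component when you swap lenses.
      static void ApplyLens(UCineCameraComponent* Camera, const FLensPreset& Lens)
      {
          if (!Camera) return;

          Camera->SetCurrentFocalLength(Lens.FocalLengthMm);
          Camera->CurrentAperture = Lens.FStop;

          // Focus still has to be driven to match the real lens; here it is just manual.
          Camera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
          Camera->FocusSettings.ManualFocusDistance = Lens.FocusDistanceCm;
      }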

  • @ngochoaitran3788 • 4 years ago

    Dear Sir, I downloaded the studio from the UE Marketplace and I can open it and see the map in UE 4.25, but when I use UE 4.24 (for Aximmetry) and open the project I don't see any map. I tried using Migrate to save the content to a new project, but in UE 4.24 I still don't see the map. How can I open a 4.25 project in UE 4.24? Thanks Sir.

    • @GregCorson • 4 years ago

      Most marketplace content works with a lot of different UE engine versions, but you have to tell it which version to use when you create the project. For the "virtual studio" map, you hit the "Create Project" button and get a "choose project name and location" box. In the middle there is a "select version of the engine for project" pop-up menu you can use to select any engine version. The virtual studio project goes back all the way to 4.20, so if you need 4.24, just select it when you make the project.

  • @CaptainSnackbar • 4 years ago

    Thanks Greg, it worked for me

    • @GregCorson • 4 years ago

      Glad to hear this. I'm finding that you can put things like different Composure setups in their own sublevels too, which makes them easier to swap out, for example one setup for a single camera and another for two cameras. You can also put your "talent marker" setup for a particular set in a sublevel to keep it separate from the set itself.
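
      If you drive the swap from C++ rather than Blueprint, a minimal sketch of loading one sublevel and unloading another looks something like this; the sublevel names "Composure_SingleCam" and "Composure_TwoCam" are hypothetical placeholders:

      #include "Kismet/GameplayStatics.h"

      // Swap the single-camera Composure sublevel out for the two-camera one.
      void SwapToTwoCameraSetup(UObject* WorldContext)
      {
          FLatentActionInfo Latent;
          Latent.CallbackTarget = WorldContext; // any valid UObject works as the latent callback target
          Latent.UUID = FMath::Rand();          // unique id so repeated calls aren't collapsed into one
          Latent.Linkage = 0;

          UGameplayStatics::UnloadStreamLevel(WorldContext, FName("Composure_SingleCam"), Latent, false);
          UGameplayStatics::LoadStreamLevel(WorldContext, FName("Composure_TwoCam"),
                                            /*bMakeVisibleAfterLoad=*/true,
                                            /*bShouldBlockOnLoad=*/false, Latent);
      }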

    • @CaptainSnackbar • 4 years ago

      @@GregCorson I already had a successful test adding a level to an nDisplay project that works with VRPN; still having issues with parallax accuracy in a large-scale studio.

    • @GregCorson • 4 years ago

      Yes, parallax is still a tough one to get exactly right because the entrance pupil position of the lens can move when you change focus or zoom the lens. Also, the effects of tracker jitter and an incorrect entrance pupil get more noticeable if you use a longer focal length lens (narrow FOV).

  • @otegadamagic • 4 years ago

    Thanks a lot for this tutorial. I'm really interested in virtual production but a total noob at VP and Unreal, though I'm into filmmaking and use software like Blender for 3D work. I'm looking to see how portable one can get from the hardware point of view, so I want to know if a laptop with good graphics can be used for virtual production and if you can recommend specs. Thanks

    • @GregCorson • 4 years ago +1

      Yes, you can use a laptop but it may limit you in the video in/out and graphics card. The supported "pro" video in/out cards are all PCI. You can use USB ones, but some features will be missing. Same thing for graphics cards: laptop ones are not as fast, so you won't be able to run scenes that are as complex. A laptop should be fine for experimental stuff and learning, but if you are trying to produce real pro stuff you should look at a system with some PCI slots. These days there are quite a few micro systems that would give you this and still be quite portable.

    • @otegadamagic • 4 years ago

      @@GregCorson Oh I see! Thanks a lot. I have a budget of $1700-1800, and I plan to also use the same computer for editing 6K videos from my Pocket 6K. Is it possible to buy or build a PC for my needs, and if possible can you suggest part links?

    • @GregCorson • 4 years ago

      It is really hard to recommend something because almost anything will work, and the best configuration for you depends SO much on exactly what you will be doing. How far you can stretch your budget depends a lot on where you are and whether you want to build it yourself or have someone do it for you. I have scratch-built a lot of PCs, but lately I've been letting other people build them so I get a guarantee/warranty. For this kind of work I would suggest 32 GB of RAM and an M.2 SSD (I have a 1 TB one in all my machines). M.2 SSDs are super fast and really help when editing video. Then you can get a large, budget-priced hard drive to back everything up on and archive stuff when you're done working on it. I'd suggest the latest Intel or AMD CPU and graphics card, the fastest you can afford, as this will help with both video and Unreal. You will want something that has a slot and power supply for a high-end graphics card and at least one more PCI slot for video in/out cards. What type of slot (1x, 4x, 8x...etc) will depend on the video in/out card you want to use. A common problem is the power supply: if you buy a desktop with plans on upgrading the parts later, be sure the power supply is big enough to handle what you will eventually buy. At my office they used to buy a low-end office machine and then put in new cards, but some cards drew so much power they caused power supplies to fail. So add up the requirements of everything you might eventually buy and make sure the power supply is big enough to handle it.

    • @otegadamagic • 4 years ago

      @@GregCorson Wow! Thank you sooo much sir. This has been more helpful than weeks of personal research. Thanks a whole lot sir.

    • @GregCorson • 4 years ago +1

      Last bit of advice: if you are doing this for personal learning, almost any system will work with the limitations I already mentioned. However, if you are planning on doing this professionally (i.e. videos and stuff for MONEY) I'd strongly advise you not to go too cheap on any of your equipment, even things like carts/mobile workstations, tripods...etc. A lot of inexpensive solutions can work, but the last thing you want is to have an expensive camera get wrecked because a cheap tripod broke, or to have computers/monitors get smashed because a caster on an equipment cart snapped.

  • @klausmoon9983 • 4 years ago

    Hi Greg, thanks for your tutorials, they have been really useful to me. I am having a problem and maybe you can help me. I am using a Blackmagic DeckLink Duo 2 for the camera in Unreal. All works fine until I generate the package for Windows (.exe); then the new program cannot detect the camera. If you can help I will be really grateful.

    • @GregCorson • 4 years ago

      I have not tried building an executable, so I'm not sure. I will see if I can test it out later this week.

  • @tjbx • 4 years ago

    Hi Greg!! I was wondering if you had any advice on, or any intention of doing a tutorial about, multiple outputs and recording? Great tutorials as always!

    • @GregCorson • 4 years ago

      The AJA video card I use is input only, so the only output(s) I have right now are from my graphics card. I haven't really experimented with outputting multiple videos. You might find some advice on this by joining the Discord group linked in the lower right of my channel banner. As far as recording goes, I have successfully used OBS and the NVIDIA screen recorder software to record the output on the computer. I also have a Ninja V hardware HDMI recorder which can record my graphics card output directly without any extra load on my computer. All of these work pretty well.

  • @coubvideo8098 • 4 years ago

    THANKS!

  • @ngochoaitran3788 • 4 years ago

    Thanks Sir.