Nuke | Deep Compositing In Rise Of The Planet Of The Apes

  • Published: 4 Oct 2024
  • КиноКино

Comments • 116

  • @vietnamnguyen1777
    @vietnamnguyen1777 1 year ago +3

    It's 2023 and I'm watching this video, still learning these techniques from 11 years ago. It's amazing

  • @bloodywolftr
    @bloodywolftr 8 years ago +38

    this is some serious compositing.

  • @bigefx2
    @bigefx2 12 years ago +1

    WOW! I'm a novice compositor to NukeX and seeing this demo on Deep Compositing just opened a whole new chapter in using this amazing creative tool. THANKS!!!

  • @FoundryTeam
    @FoundryTeam  12 years ago +1

    Hi @sleek1978. Part 1 of 2: Deep is different to a standard z-depth. z-depth gives you one sample for depth at a particular pixel - the first item it hits as the ray is cast into the scene.
    A deep image gives you multiple samples (colour values) per pixel, going back in depth. So for example if you have two objects, one in front of the other at a particular pixel, then you'll get two samples, one for the object in front at a particular depth, and one for the object behind at its own depth.
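The Foundry's explanation above maps naturally onto a tiny data structure. A minimal Python sketch of the idea (illustrative only, not the Nuke API or the OpenEXR deep file layout):

```python
# Toy illustration (not the Nuke API or the OpenEXR file layout):
# a z-depth image stores ONE depth value per pixel, while a deep pixel
# stores EVERY surface the camera ray crossed, each with its own colour.

# Two objects along the same ray: one in front, one behind.
deep_pixel = [
    {"depth": 4.0, "rgba": (1.0, 0.0, 0.0, 0.6)},  # front object, semi-transparent red
    {"depth": 9.0, "rgba": (0.0, 0.0, 1.0, 1.0)},  # opaque blue object behind it
]

# A conventional z-depth channel keeps only the nearest hit:
z_depth = min(sample["depth"] for sample in deep_pixel)
print(z_depth)  # 4.0
```

Flattening `deep_pixel` reproduces the ordinary RGBA image; keeping the full sample list is what lets the comp stage decide occlusion later.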

  • @capoman1
    @capoman1 10 years ago +3

    Thanks for the walk-through. It was awesome to hear from someone in a feature film, someone from a professional production facility. I came here not knowing what deep compositing was. Now I feel like I have a pretty good understanding of what it is, and what the benefits are. I even understand somewhat the workflow.
    Deep is basically the merging of the 2-D and 3-D world, letting us treat 2-D data in a 3-D way, including viewing a rudimentary 3-D preview of the scene without actually having to integrate any 3-D elements. Pretty awesome.
    One thing I have heard about deep is that it takes a ton of hard drive space. But just look at what becomes available to the compositor: they don't have to know a ton about 3-D, yet they can work with many 3-D elements in the scene and add rotoscoping and holdouts at the correct 3-D position. Freaking awesome.
    It would have been nice if you had turned on proxy mode so that the rendering was much faster for this walk-through. But thank you so much for sharing.
    I have to imagine that a ton of time went into rotoscoping there. I have to wonder if just creating the entire scene in 3-D wouldn't have been just as quick.

  • @plusminusonehalf
    @plusminusonehalf 12 years ago +1

    I had a rough idea of what deep comp was but now it is crystal clear... very useful indeed... and go for those exploding bananas

  • @press_start_button
    @press_start_button 12 years ago +2

    This is a really good explanation of how to use deep in compositing. I would like to have a real go with deep on my next show. Thanks for sharing, mate!

  • @QuiteDan
    @QuiteDan 12 years ago +1

    I think the most impressive part is getting the damn fur to play well with all the advanced compositing passes.

  • @davidm.johnston8994
    @davidm.johnston8994 7 years ago +1

    That is really mind-blowing. I'm just starting to learn Nuke at school and I can't wait to get a little more experienced with it. Thank you very much for this video, I love having this insight into such a big movie production.

  • @AlDelcyCreative
    @AlDelcyCreative 11 years ago +4

    that thing makes AE look like the iMovie of Compositing.

    • @xanzuls
      @xanzuls 3 months ago

      ae is for kids

  • @FoundryTeam
    @FoundryTeam  12 years ago +1

    @sleek1978 Part 2 of 2: continued... This means you maintain control over how this data is dealt with right up to the comp stage. Say, for example, you decide you don't want the object in front: you can simply mask it out (using a mask for a particular depth range), and you'll be able to see the object behind.
    Of course, as with all things cutting edge there is a trade-off - in this case that your source files are bigger, but the extra control may make it worthwhile, depending on how you work.
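The depth-range masking described above can be sketched in a few lines of Python. This is a toy model of the idea, not a Foundry node; `mask_depth_range` is a hypothetical helper:

```python
# Toy model of depth-range masking (hypothetical helper, not a Foundry node):
# discard every sample whose depth falls inside [znear, zfar], so whatever
# lies behind the masked object becomes visible again.
def mask_depth_range(deep_pixel, znear, zfar):
    return [s for s in deep_pixel if not (znear <= s["depth"] <= zfar)]

deep_pixel = [
    {"depth": 4.0, "rgba": (1.0, 0.0, 0.0, 0.6)},  # object in front
    {"depth": 9.0, "rgba": (0.0, 0.0, 1.0, 1.0)},  # object behind
]

# Mask out the front object (anything between depth 3 and 5);
# only the rear sample survives, so the rear object is revealed.
remaining = mask_depth_range(deep_pixel, 3.0, 5.0)
print(len(remaining))  # 1
```

With a plain z-depth image this is impossible: the rear sample was never stored, so there would be nothing left to reveal.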

  • @LairdSquared
    @LairdSquared 10 years ago +1

    Very generous to share some of your vfx secrets... Great to see!

  • @FoundryTeam
    @FoundryTeam  12 years ago

    We asked Robin and got this response: "Yes we had match moves for the cars, but they were pretty simple. Edge detail was not as good as we needed it to be, so we had to combine the matchmove deep with roto shapes (turned to deep). Also, there was no need to model every type of car on that bridge, so animation would have used a lot of stand in cars for similar vehicles. This added to the lack of detail."

  • @privateportall
    @privateportall 10 years ago +1

    thx for the introduction. really awesome stuff

  • @Nickyreynolds
    @Nickyreynolds 11 years ago

    The deep nodes are going to change rotoscoping and compositing so much. Very useful tutorial!!

  • @IAMDIMITRI
    @IAMDIMITRI 5 years ago

    So deep compositing is like an alpha channel for pixel depth in the image. Kinda cool !

  • @FoundryTeam
    @FoundryTeam  12 years ago +1

    2 part reply from Robin: "If animation on an ape changed we had to re-run reflection and shadow passes, but they were very fast compared to doing an actual ape. We could render each shadow/reflection for each ape separately and deep merge them together. We only rendered a handful of shadow passes for one animation change." continued below...

  • @kuunami
    @kuunami 12 years ago

    Great tutorial and very informative. The only bit of advice I'd like to offer is that you should figure out a compression method that will retain the detail and legibility of the text within the video. Even in 720p, the onscreen text in the tutorial (names of nodes, menu text, etc.) is illegible.

  • @Meteotrance
    @Meteotrance 9 years ago +4

    Thanks, it's rare to see in detail how pros work on a blockbuster. But how do the director and production people plan stuff like that, I mean for stereoscopic work, dangerous stunts, explosions, or just subtle background changes for a period film? Color grading and compositing are valuable, but it seems today's movies are more and more 3D compositing and less and less real shooting.

  • @AngelsNeverFade
    @AngelsNeverFade 12 years ago

    Omg I'm studying VFX and I think this is soooo cool :) can't wait to get my hands on it

  • @Dechristian3
    @Dechristian3 11 years ago

    This is excellent! Why haven't I heard of this!

  • @MrFelixdodd
    @MrFelixdodd 11 years ago

    Surely better to have full 3D data in that scene than roto the cars? I can see the benefits of proxy 3D data to help compositing efficiency in future projects using this feature - awesome stuff.

  • @sleek1978
    @sleek1978 12 years ago

    @TheFoundryChannel thank you very much ...very helpful answer

  • @plancton4058
    @plancton4058 2 years ago

    I don't know why but I watched the whole thing

  • @ipi223
    @ipi223 11 years ago

    very interesting! thanx for sharing!

  • @deveshupadhyay
    @deveshupadhyay 7 years ago

    Nice Info..Thanks man

  • @mortyexplains
    @mortyexplains 10 years ago +3

    "an explosion of bah-nah-nahs" lolz

  • @GoDxism
    @GoDxism 9 years ago +1

    Dammit. Time to learn NukeX

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      Deep compositing is actually also available in standard Nuke, not just NukeX.

  • @QuiteDan
    @QuiteDan 11 years ago

    What's the next step in future compositing?
    Doopth passes?
    I think we already know the answer is yes.

  • @PotentStudios
    @PotentStudios 10 years ago +1

    My mind is blown.

  • @mnoor8946
    @mnoor8946 11 years ago

    This was a useful tutorial. I wish you could tell us about the fog in the distance

  • @molgamus
    @molgamus 12 years ago

    That makes a lot of sense. So I would imagine you used roto for the silhouette and the rendered deep data for depth culling within the car? Eg. the roof being further back than the hood.

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      I have a gut feeling that it was not so sophisticated, just plain roto with single depth for the cars.

  • @thuptengyurmey
    @thuptengyurmey 3 years ago

    Can we get the footage so that we can try and learn…. Plz! 🙏🏻

  • @RuiPedroSousaoficial
    @RuiPedroSousaoficial 8 years ago +1

    Wow, what an amazing experience it would be if it were possible for us to gain access to this shot to try our own compositing

    • @davidm.johnston8994
      @davidm.johnston8994 7 years ago +1

      Rui Pedro Sousa That's my all time dream, to have access to big movie productions' assets and be able to play with them.

  • @harima40
    @harima40 11 years ago

    Hi there! Awesome video! One question though: at 35:50 you show the complete matte of the cars, which we suppose is the result of all the rotos after being placed at the correct depth on cards. But it looks like there is depth within each car. I am assuming there was some kind of multiplication of the rotos with a rough depth pass from the simple CG car geometry? And if that is the case, how do you work it out without the depth of the proxy geo, which is not always standard in smaller productions?

  • @masterxeon1001
    @masterxeon1001 11 years ago

    I don't even use Nuke but wow, what a level of control you have with the depth. I still don't get how you got a z pass of the cars and they're not digital footage.

    • @Peter-ue4iz
      @Peter-ue4iz 1 year ago

      All footage is digital, meaning not analog, but you probably meant that the cars are not 3D models.

  • @michaelandremovies
    @michaelandremovies 1 year ago

    What about car movement and car bonnet/roof denting when the monkeys are jumping on them?

  • @TheHurny
    @TheHurny 6 years ago +1

    What's with this video quality? Can't you do 1080p?

  • @molgamus
    @molgamus 12 years ago

    Great explanation of how to use deep data properly. When I first heard of it I thought it would only be useful for volumetric effects like smoke and clouds. But this video really shows how useful automatic matting is.
    But in what context were the apes animated? Were there no low-res blocked-out cars in that scene? And if so, could you not ask your CG dept. to render a deep image of those? Would you still need to do the roto then? Could you not deep-crop out certain parts for CC then?

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      I think DeepCrop works with a bounding box, znear and zfar. Roto has more flexibility for the matte shape.

  • @miguelpress
    @miguelpress 11 years ago

    @Shaun Fontaine Blender does compositing, I just stated I like freeware. I'm not comparing it

  • @bernd_the_almighty
    @bernd_the_almighty 12 years ago

    cool channel. subbing.

  • @mohammadnemati8178
    @mohammadnemati8178 5 years ago

    nice

  • @europaeio
    @europaeio 10 years ago

    Hollander: I'm sure you're not using a PC. But since I have one, I would like to know how much RAM I should have installed to be able to do what you're doing in the video. Great presentation. Thanks

    • @enessensation
      @enessensation 9 years ago

      12-16 GB should be enough for 1080p and standard 25 FPS

  • @MrHieuha
    @MrHieuha 10 years ago

    thanks!

  • @Ramt33n
    @Ramt33n 10 years ago +3

    does it work with bears too? :D just kidding, have a question though, how do you integrate separate lighting elements into this workflow? say I have rendered my reflections, diffuse, gi, etc, into different channels of the deep exr. how can I shuffle them, process and add them back together?
    thank you :)

    • @jasonllapp
      @jasonllapp 10 years ago

      HAHA
      :)

    • @vinchern
      @vinchern 10 years ago

      I think one of the ways is to render the deep and the regular exr separately, then use a DeepRecolor to combine them.

  • @bernd_the_almighty
    @bernd_the_almighty 12 years ago

    I wonder why mess with depth etc. when you could basically do something like creating low-poly versions of the cars and then having a node that masks out everything that's behind that low-poly mesh. Much less data needs to be stored. Furthermore, I think we could build an algorithm that automatically creates low-poly meshes by analyzing pixel motion.

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      If you have 3D data available, sure, but it is not always the case. Also, some 2D artists might feel more comfortable with 2D processes rather than learning complete 3D workflows, I think.

  • @ady7d77
    @ady7d77 2 years ago

    What RAM and CPU are you using, please?
    I've got a Pentium III and it freezes

  • @1Poxxxx
    @1Poxxxx 10 years ago +4

    how did u add the reflections?

    • @Kuk0san
      @Kuk0san 7 years ago +1

      My guess would be that they are rendered separately out of Maya and he just comps them on top.

    • @ashishsomkuwar4700
      @ashishsomkuwar4700 7 years ago +1

      It must be a reflection pass rendered from 3D software; then in Nuke you have to apply the "Plus" operation. Keep in mind for lighting: passes that are in color (RGB) have to be "Plus"-ed, and passes that are black and white, for example ambient occlusion, usually have to be "Multiply"-ed. Hope you get that. Best of luck with your studies.
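The Plus/Multiply convention in the reply above can be sketched in plain Python (a toy model of the merge math on RGB triples, not Nuke's Merge node):

```python
# Toy model of the two merge operations on plain RGB triples
# (not Nuke's Merge node): additive light passes are summed ("Plus"),
# and black-and-white occlusion passes scale the result ("Multiply").
def plus(a, b):
    return tuple(x + y for x, y in zip(a, b))

def multiply(a, b):
    return tuple(x * y for x, y in zip(a, b))

diffuse    = (0.4, 0.3, 0.2)   # coloured light pass
reflection = (0.1, 0.1, 0.1)   # coloured light pass
occlusion  = (0.5, 0.5, 0.5)   # grey AO-style pass, darkens uniformly

beauty = multiply(plus(diffuse, reflection), occlusion)
print(tuple(round(c, 2) for c in beauty))  # (0.25, 0.2, 0.15)
```

Summing the light passes rebuilds the total illumination; multiplying by the occlusion pass then darkens crevices without adding any light of its own.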

  • @schmoborama
    @schmoborama 12 years ago

    @hulllewis0817
    "How do you load mpeg or avchd files into nuke"
    Best way is to export your video to an exr sequence. Nuke is not an encoder/decoder application. I wish they hadn't added the ability to import any video at all, b/c now people (not saying you) will complain that it doesn't read the codec of their video or it doesn't work well enough.

  • @zche083
    @zche083 9 years ago

    The topic is really good, but hard to follow.
    It would help if you could give an introduction to the whole structure setup before going into the details.

    • @oBCHANo
      @oBCHANo 8 years ago

      +Jason Chen It's not hard to follow at all; clearly deep compositing for a feature film is not a topic for beginners.

    • @zche083
      @zche083 8 years ago

      oBLACKIECHANoo Could be

  • @MrGAS3D
    @MrGAS3D 12 years ago

    Nice!
    One question. When you select a file in the DeepRead, which is it? I mean, is it the same as the RGBA or is it a different one? Basically, is deep another different RENDER ELEMENT, or is Nuke transforming an RGBA into deep?? I hope you can understand me...
    Thanks in advance!!

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      DeepRead takes a rendered deep exr. It does not convert a traditional 2D EXR into a deep image. You can use DeepFromImage for that, although it will not be entirely the same as if you had a rendered deep exr in the first place.

  • @ToffepeerTH
    @ToffepeerTH 12 years ago

    Does Flame have something similar to this? I'd be interested to know :)

  • @schmoborama
    @schmoborama 12 years ago

    @sleek1978
    "so the "Deep" node is actually Z-depth ...right?"
    Z-depth-zilla is more like it. As far as I understand it, you can think of it in these ways:
    - It will store depth data for something that's completely behind something else.
    - A regular z-depth image is a 2D image - deep is a 3D z-depth image.
    - Imagine you have a different z-depth image for every millimeter (or pixel) of distance from the camera.

  • @heraldfrancis
    @heraldfrancis 8 years ago +10

    If you drink every time he says deep, you will be most definitely smashed :)

  • @sleek1978
    @sleek1978 12 years ago

    so the "Deep" node is actually Z-depth ...right? or is it something else? I'm new to Nuke and I'd really appreciate an answer

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      It is a series of Z-depth blades (slices) making up the deep image. You have multiple samples per pixel as opposed to z-depth which has one sample per pixel, the closest object to the camera (smallest depth).

  • @Qwa7
    @Qwa7 12 years ago

    360p ?((

  • @cubsin4
    @cubsin4 11 years ago +1

    All the special effects in the world and a multimillion dollar budget and the studio put a commercial vehicle license plate on a car, the black Nissan Maxima, 6Z76299 CA.

  • @QuiteDan
    @QuiteDan 11 years ago

    What is the size of a frame with Deep data?

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      It can be easily up to 150MB depending on the complexity of the frame, I would think.

  • @sahelanthrope
    @sahelanthrope 11 years ago

    If you're going down that path, you'll be better off trying to bargain a seat for Autodesk Flame.

  • @amineroula
    @amineroula 12 years ago

    the foundry nuke :)

  • @miguelpress
    @miguelpress 11 years ago

    @Shaun Fontaine I am not a professional so dats okay

  • @sandeepdigital1088
    @sandeepdigital1088 7 years ago

    Hello sir, I want to know how to download Nuke passes files online. Please reply sir

  • @XxStopherLixX
    @XxStopherLixX 11 years ago

    Yeah, I'd rather stick with AE. It will still get the job done nicely.

  • @espheroz
    @espheroz 9 years ago

    Can anybody explain the process of making this kind of film?? This is all I know:
    3D modeling -> rigging -> texturing -> rendering -> export to Nuke -> adding environment -> compositing -> final??
    Please add the missing parts or correct me if I am wrong

    • @VFXLtd
      @VFXLtd 8 years ago

      +esp heroz final = rendering haha

    • @thomasip9938
      @thomasip9938 8 years ago

      +esp heroz final = editing -> colour grading -> tweaking everything again -> rendering -> DONE

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      I am not sure it required 3D modelling, as the point of deep is kind of that you do not need 3D modelling, in my opinion. Otherwise, you could just use that directly.

  • @rustytoe178
    @rustytoe178 12 years ago

    Hi, NukeNoob here.
    I have a scene where the 3D object I have imported needs to pass behind an object in the video (e.g. passing behind a lamppost).
    How would I go about doing this? Would I have to mask/rotoscope like in this tutorial?

    • @laszlopapp2774
      @laszlopapp2774 2 years ago

      If you have true 3D data, you may not need to Roto (and Deep).

  • @LeighKrampeVFX
    @LeighKrampeVFX 10 years ago

    Still confused about what 'deep' is

    • @Freelancerk1bbles
      @Freelancerk1bbles 10 years ago

      It uses a depth channel to calculate which pixel (each pixel) should be drawn in front of which other pixel.
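The reply above describes depth-sorted compositing. A minimal Python sketch of flattening one deep pixel with the standard "over" operator (illustrative only, not Nuke's implementation):

```python
# Toy flatten of one deep pixel (illustrative, not Nuke's implementation):
# sort the samples front-to-back by depth, then accumulate them with the
# standard "over" operator so nearer samples occlude those behind them.
def flatten(samples):
    r = g = b = a = 0.0
    for s in sorted(samples, key=lambda s: s["depth"]):
        sr, sg, sb, sa = s["rgba"]
        r += sr * sa * (1.0 - a)  # each sample contributes only through
        g += sg * sa * (1.0 - a)  # whatever coverage is still unclaimed
        b += sb * sa * (1.0 - a)
        a += sa * (1.0 - a)
    return (r, g, b, a)

samples = [
    {"depth": 9.0, "rgba": (0.0, 0.0, 1.0, 1.0)},  # opaque blue, behind
    {"depth": 4.0, "rgba": (1.0, 0.0, 0.0, 0.5)},  # 50% red, in front
]
print(flatten(samples))  # (0.5, 0.0, 0.5, 1.0)
```

Because the depth sort happens at flatten time, samples can be added, removed, or merged in any order beforehand, which is what makes deep merging so flexible.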

  • @gamezone047
    @gamezone047 11 years ago

    rly? so tell me about the advantages this software has over AE

  • @thejohn3791
    @thejohn3791 11 years ago

    AE is mostly for CG effects, and really isn't for 3D editing like this software.

  • @shadowcasterstudios4042
    @shadowcasterstudios4042 2 years ago

    If it doesn't have a tail, it's not a monkey, even if it has a monkey shape. If it doesn't have a tail it's not a monkey, and if it's not a monkey, it's an ape

  • @bernd_the_almighty
    @bernd_the_almighty 11 years ago

    Nuke is far more flexible than AE. AE is very clumsy on non-trivial tasks. You just get lost about which layer something is on, and you often need a lot of error-prone copy-pasting in AE compared to Nuke.

  • @Synicade
    @Synicade 12 years ago

    Uhm, you realize Blender is not a fourth as advanced as Nuke, right?
    Blender only has EXTREMELY basic compositing.

  • @Mr_Dee
    @Mr_Dee 8 years ago +3

    This goes deep! Thanks for the excellent explanation. Treat yourself to a banana :)

  • @bernd_the_almighty
    @bernd_the_almighty 12 years ago

    So I mean something like 3D masks

  • @Horrorgraphy
    @Horrorgraphy 9 years ago

    OMG...deep deep deep deep deep deep deep deep deep deep deep deep deep deep deep

    • @zche083
      @zche083 9 years ago

      Horrorgraphy Very hard to follow,

    • @Horrorgraphy
      @Horrorgraphy 9 years ago

      sorry, it just annoys me.

  • @AhmadALderabany
    @AhmadALderabany 12 years ago

    What's the name of the program you use?

  • @rijve11
    @rijve11 12 years ago

    Can anybody tell me what that software is called??

  • @dannyelfilms
    @dannyelfilms 10 years ago +1

    The compositing capabilities of Cinema 4D are limited compared to Nuke, LengendaryKidd

  • @miguelpress
    @miguelpress 12 years ago

    I stay with Blender

  • @monocore
    @monocore 5 months ago

    Oh geez this was 12 years ago, surely adobe has something similar by nHAHAHAHAHAHAHAH

  • @morgothFLOW
    @morgothFLOW 12 years ago

    @bradchodges cars aren't that hard. Human faces are.

  • @gamezone047
    @gamezone047 11 years ago

    wut ?